What is it about IT, and when will it all end?

Original author: @igorsimdyanov

Yes, we need IT

The foundation of our civilization is tools. We didn't just evolve to the point of being able to use them; we began improving them ourselves. Our tools keep becoming more intricate, more efficient, more refined. A tool can be a hammer, an industrial robot, or a monetary system.

Some of our tools are difficult to grasp or comprehend; they are more like an environment or a subject of study: the Internet, the media, the transport system. It is hard even to call them tools; they are more a reflection of our activity. For simplicity, let's call everything people make with their own hands a tool, meaning anything that speeds us up and makes our lives easier and more comfortable.

Why do we need tools? On the one hand, they help solve emerging problems; on the other, they raise the standard of living. We enjoy creating tools. I would say it is one of our instincts.

Our bodies need a certain amount of mathematics just to function. The processes in our heads project that mathematics into language. Language can be written down as symbols, and with symbols one person's discovery can be passed to another, or to a thousand others. This lets us build ever more complex tools. Most importantly, we genuinely like it: the brain rewards us every time we invent or achieve something. So our tools develop along with us; the mechanisms of their development and improvement are wired into us.

Migratory birds fly south in autumn and north in spring: that is their instinct. Ants build an anthill: that is theirs. And if people are left well-fed and in peace, they start encoding secret messages in the Bible, inventing elaborate etiquette, developing mathematics, experimenting with materials.

In The Language Instinct, Steven Pinker develops Noam Chomsky's idea that language is an instinct of our species. People form a language within two generations; it is wired into the brain. The idea of the tabula rasa (blank slate) does not hold: we have not been inventing our languages anew for millennia; rather, after birth we pick up language structures almost ready-made. For the first three years a child runs a "program" of adaptation to the surrounding language environment. The diversity of languages exists precisely because people cannot control language: it changes constantly, but that is no problem for newborns, since the rules for acquiring languages, the development program, are the same for all people. The defining feature of our species is that we pulled evolution out of DNA: we learned to encode and transmit information not genetically but in the form of symbols. We speak languages and make tools not because circumstances push us to, but because we cannot help doing it, just as migratory birds cannot help flying south in autumn and north in spring.

But back to the IT industry. Why do we need computers, and what are they? For a very long time tools were static: the ground was dug with a digging stick, game was struck with a club. All of it required direct human participation and muscle power. Once humans understood the nature of energy and harnessed it, they gained tools that could work almost without their participation. The water mill, steam, combustion, and then electricity gave rise to a completely different type of tool: machines. Now there is no need to gather a crowd of people in one place to build something big; machines are far more productive and powerful than humans.

Machines, however, do not adapt. Once built, a water-driven hammer will rise and fall as the nearby river drives it. You will not convert it into a drilling machine in half an hour, however badly you need one. The answer to this inconvenience was the computer: a programmable machine whose behavior can be modified. And to change the behavior of such machines, you need programmers.

Developers of computers and software make up the workforce of the IT industry. When we talk about the demand for IT specialists and their salaries, we are talking about the people who develop computers and the programs that run on them.

Instead of building houses, synthesizing nitrobenzene, working on a bulk carrier, or doing some other useful paid work, developers fiddle with computers. Why? Because there is demand for their services.

If humanity finds other ways to cope with the problems computers now solve, the whole story of programming and IT will end: no one will pay for it, and there will be no need for it. Almost all developers will move on to whatever society does demand. IT specialists cannot be a driving force by themselves; people need the results of their work. It is this willingness to pay for computers and software that feeds a powerful IT industry.

Why are their salaries higher than other engineers'? Designing ships or electrical substations is no easier than building web forms and serving CRUD requests. There is competition for IT personnel; there are not enough of them. So how is it that in 70 years the right number of developers has never been trained? There are enough engineers to design substations, so where do the developers go? It is unlikely that at some point in their lives they abruptly stop needing the money business has been pouring into IT for decades. How long can such a period last, and what would signal the transition of IT into a normal, non-overheated engineering specialty? Let's try to answer these questions below.

Fuel for IT

We live in a unique time of virtually explosive development of the computer industry. Within a single generation our technologies and languages have changed beyond recognition. The life of an IT specialist is a life on a hamster wheel: acquired knowledge and experience become obsolete much faster than in other professions.

It seems to me that for a long time the fuel driving the IT industry was Moore's law. Gordon Moore, a co-founder of Intel, formulated this rule of thumb: roughly every 18 months, the number of transistors in an integrated circuit doubles. It is usually plotted on a logarithmic scale:

If we plot it not on a logarithmic scale but honestly, in raw transistor counts, we get an exponential curve.
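A minimal sketch of the doubling rule can make the two views concrete. The 1971 starting point (Intel 4004, roughly 2,300 transistors) is a real figure; the strict 18-month cadence is the article's rule of thumb, not a fit to actual chips.

```python
import math

def transistors(year, start_year=1971, start_count=2300, doubling_months=18):
    """Projected transistor count under a fixed doubling period."""
    periods = (year - start_year) * 12 / doubling_months
    return start_count * 2 ** periods

# On a linear ("in pieces") scale the curve blows up exponentially...
for year in (1971, 1980, 1990, 2000):
    print(year, f"{transistors(year):.3g}")

# ...but on a logarithmic scale each year adds the same constant amount,
# which is why the classic chart is drawn with a log y-axis.
step = math.log10(transistors(1981)) - math.log10(transistors(1980))
print(f"log10 growth per year: {step:.3f}")
```

Running this shows why the log plot is a straight line: the yearly increment in log10 is exactly (12/18) × log10(2), independent of the year.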

Since the 1970s the IT industry has been growing exponentially. That is what an explosion looks like, and the consequence is the explosive development of the entire computer industry. As soon as we start mastering some technology and getting good at it, more powerful computers, processors and video cards arrive, and we have to switch to another technology, an approach that exploits the increased capabilities. While we are mastering that, something else happens: computers get networked, the Internet appears, a new industry forms. Then it turns out a huge number of people use the Internet; whole countries disappear into it. Now we have big data and need to do something with it, so another industry appears to process it. We cannot cope with big data, so we plug in mathematical models and neural networks, and now we are shaping the field of artificial intelligence.

Here I have drawn an optimistic picture, with straight lines, of how people master a single technology (assembler, a high-level language, a database).

Moore's law is the engine of this explosive process, but it acts from the hardware side, the side of computing capability. Human consciousness cannot develop exponentially; at best it develops linearly. Breakthroughs and discoveries mature very slowly and refuse to double every 18 months. Yet in the IT industry we have been forced to live in this revolutionary situation for decades: the growing hardware keeps changing the rules of the game. It is not just the languages that change, but the whole approach to building programs.

As a result, the dominant concepts in programming languages change by leaps, much as Thomas Kuhn describes in The Structure of Scientific Revolutions. Only here there is no need to wait for the generation of scientists working in the old paradigm to die out: here it is the hardware that "dies", or rather becomes obsolete. To program the new hardware, a younger generation of developers arrives with an alternative approach, often with different programming languages and tools.

This hypothesis partly explains the shortage of developers: absorbed in old technologies, not everyone manages to switch to new ones in time. Shortages appear in fresh fields, and young developers fill them: with nothing old to unlearn, they can start right away with the latest technologies.

This is also why legislation has failed to force many areas of IT under strict licensing. Such attempts have been made, but breakthrough solutions that cancel out the old ones keep arriving on the market. If someone writes into law which software must be used for a given activity and how, the growth of hardware capability lets competitors solve people's problems more efficiently (as email replaced regular mail), and the laws end up stillborn. For example, there is still a law under which only the state may deliver letters: you can create a commercial cargo delivery service, but not one for letters in envelopes.

Milestones in IT

To predict the future, we need to know where we are now. Let's run through the main milestones.

Let's go back to the middle of the last century and see how programmers worked then. You are handed a processor whose instructions are all numbered. Programming in machine codes means combining instruction numbers with memory addresses. Such a program is, in effect, a ready-made executable that the processor runs directly. Say instruction 44 adds and instruction 23 subtracts; memory cells are numbered too. You combine them and end up with one long string of numbers.

Programming in codes is tedious: people are bad at remembering numbers. Worse, with each new processor the instruction numbers changed, and the program had to be rewritten. So developers created assembler: instruction codes were replaced with mnemonics, short letter abbreviations that are much easier to remember.
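A toy sketch can show the difference between bare opcode numbers and mnemonics. It reuses the made-up opcodes from the text above (44 = add, 23 = subtract); real instruction sets differ, and this only illustrates why mnemonics are easier on humans.

```python
# A toy "processor": each numbered instruction maps to an operation.
OPCODES = {44: lambda a, b: a + b, 23: lambda a, b: a - b}
# An "assembler" is essentially a table from mnemonics back to numbers.
MNEMONICS = {"ADD": 44, "SUB": 23}

def run(program, x):
    """Execute a list of (opcode, operand) pairs against a value x."""
    for opcode, operand in program:
        x = OPCODES[opcode](x, operand)
    return x

# "Machine code": bare numbers, hard to read, tied to one processor.
machine_program = [(44, 10), (23, 3)]          # add 10, subtract 3

# "Assembly": the same program written with mnemonics, then assembled.
asm_program = [("ADD", 10), ("SUB", 3)]
assembled = [(MNEMONICS[op], arg) for op, arg in asm_program]

print(run(machine_program, 0))   # 7
print(run(assembled, 0))         # 7
```

When the processor changes, only the lookup tables need updating; the mnemonic program survives, which is exactly the convenience assembler bought.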

Assembler made it possible to create long-lived programs, such as operating systems. But if the processor architecture changed, the program could be thrown away: adapting it is very hard.

All this effort on programs is paid for by business, which has to make a profit. The job of commerce is to count money and optimize costs, so the first question business asks is: "Why do we rewrite programs every time? Let's make it possible to move them from one computer to another."

This is how high-level languages appeared. Of course, they did not arise as a demand from commercial structures, but they spread largely thanks to business. With a high-level language you write a program once, and it stays portable no matter what new processors you get.

Now you can write operating systems, games, office suites, and these programs begin to live for years. They grow in volume and become hard to understand, and specialized languages become the next stage. These operate not in computer terms but in the entities of the domain: instead of addresses, pointers and files you deal with a discount rate, a tax, an account, an employee. It is much more convenient to build programs in the language of the subject area.

The idea is this: why speak the computer's language at all, with its files, variables and sockets? It is hard. Let's create a language that deals directly in invoices, users and salaries; a business idea is easier to express that way. There will be fewer bugs, and we will need fewer programmers.

But developing a specialized language is very expensive. It takes years, and you also have to build a community around the language so that there is a talent pool of enthusiasts who will teach it to other developers. A dedicated language for every kind of activity is simply unrealistic.

So the next stage was object-oriented programming languages, where you first build a language of the domain and then use it to solve the applied problem. In a game, for example, we first create the objects of the game world, and then assemble the game itself out of them.

Translated from games into programmers' terms: within a universal object-oriented language we can first create a language of any subject area, and then develop the program we need in it.
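The idea can be sketched in a few lines. The classes below form a tiny hypothetical "language of the game world" (the names Player and World are invented for illustration); the applied program at the bottom then reads in domain terms rather than computer terms.

```python
# First, the "domain language": classes that model the game world.
class Player:
    def __init__(self, name, health=100):
        self.name = name
        self.health = health

    def hit(self, damage):
        """Take damage; health never drops below zero."""
        self.health = max(0, self.health - damage)

class World:
    def __init__(self):
        self.players = []

    def add(self, player):
        self.players.append(player)

    def alive(self):
        """Names of players still standing."""
        return [p.name for p in self.players if p.health > 0]

# Then the "applied program", written in domain terms:
# no files, pointers or sockets in sight.
world = World()
hero, villain = Player("hero"), Player("villain")
world.add(hero)
world.add(villain)
villain.hit(120)
print(world.alive())  # ['hero']
```

The point is not the game itself but the layering: the applied code never mentions the machine, only the vocabulary that the class layer defined.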

In the late 1980s and early 1990s a great many languages were created. Today there are an order of magnitude fewer living programming languages, and that is thanks to object-oriented languages: we can learn Python or C++ and write anything in them.

The process continues to this day: new languages keep appearing and will go on appearing, because people constantly change even their natural languages, let alone computer ones. Especially when there is a concrete reason to create new programming languages, such as the arrival of multi-core processors or distributed systems.

And what happens if Moore's law stops? Some say it already has; others think not yet. Why did the question of Moore's law stopping arise at all, and is it real? Let's try to figure it out.

Stopping Moore's Law

Suppose Moore's law has stopped working. The IT industry has not had the tens and hundreds of years other engineering disciplines had to hone their technologies to perfection. The development of IT is driven not by the rise of human consciousness but by the explosive growth of microelectronics: mass production, miniaturization, falling prices. Human consciousness barely keeps up, adapting with a creak; we abandon old achievements without ever bringing them to perfection.

The IT industry is a layer cake of half-baked layers. Every layer still needs work and more work.

Our programs consume an order of magnitude more memory and resources than they could. We trade excessive memory and CPU consumption for development speed, and Moore's law kept ensuring the steady growth of both.

If Moore's law stops, the IT industry will keep developing by inertia. Yes, the fuel for the exponential climb is gone, but the industry has momentum. Most likely we end up on an S-shaped curve, and at its end, on a plateau.

The S-shaped curve is the fate of almost every explosion: whatever grows sharply sooner or later levels off onto a plateau. The transition into an S-curve is all but inevitable. The only question is where we are now: still inside Moore's law, at the inflection point, or already past it?
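The S-curve discussed here is the standard logistic function, and a few lines of code show its two regimes: far from the midpoint it is indistinguishable from an exponent, past the midpoint growth nearly stops. Parameters below are purely illustrative.

```python
import math

def logistic(t, ceiling=1.0, rate=1.0, midpoint=0.0):
    """Standard logistic curve: exponential start, plateau at `ceiling`."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Far left of the midpoint: each step multiplies the value by roughly
# the same factor, exactly like an exponent.
early_ratio = logistic(-9) / logistic(-10)
# Far right of the midpoint: growth has almost stopped (the plateau).
late_ratio = logistic(10) / logistic(9)

print(f"growth factor early: {early_ratio:.3f}")  # ~e, about 2.718
print(f"growth factor late:  {late_ratio:.6f}")   # ~1.0
```

This is also why the author's question is genuinely hard: sitting on the left half of the curve, the data alone cannot tell you how far away the midpoint is.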

There is no answer yet. It is like stock charts: looking back through historical quotes, it is easy to see the lows and highs, where one should have bought and where sold. In reality you have only the beginning of the chart and an unknown future that is hard to predict. By some accounts the inflection already happened around 2010; by others it is still ahead. The main question is what awaits us there, in the future, as we approach the plateau of the S-curve.

Why Moore's Law Should Stop

There is a common joke: the number of people predicting the end of Moore's law doubles every 18 months. What does this army of forecasters base its conclusions on? If you look at a chip under magnification, you will see something like the picture below.

The magnified picture shows the layers of conductor; the empty space is filled with dielectric.

The dielectric walls in our chips are down to 5 nm, which is 50 angstroms. Manufacturers promise 3 or 2 nm, that is, 20-30 angstroms. For comparison, a water molecule is about 1.5 angstroms across. If we shrink the dielectric wall further, quantum tunneling effects appear: electrons leak where they are not supposed to go. This does not mean processors cannot be built on tunneling effects, but there are no fresh ideas ready for implementation within the next 18 months.

Moreover, even today's processors have become very hard to make. Copper, used as the conductor, begins to diffuse through the dielectric wall, producing defective chips.

Optical lithography gives at best a resolution of about 150 nm, and you need 3 or 2 nm.

Lithography is the technology for drawing the chip pattern on a substrate. A negative of a chip layer is put on a transparent film, through which radiation, say visible or ultraviolet light, shines onto the substrate. The chemical bonds in the exposed layer are destroyed, and treating it with acid burns the pattern set by the negative into it. Then another layer is applied, and the operation is repeated until the intended chip is complete.

Visible light ends at around 400 nanometers; with ultraviolet you get down to the order of 100 nanometers. For X-rays and gamma radiation you are unlikely to find a workable mask material, and they are hard to work with anyway. An additional nuisance is that light is a wave: at fine feature sizes it interferes with itself at about half a wavelength, so the feature sizes actually achievable at a given wavelength are limited. And we need a resolution of 3 or even 2 nanometers.
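The wavelength limit can be made quantitative with the standard Rayleigh resolution criterion (not stated in the article): min_feature = k1 × wavelength / NA, where NA is the numerical aperture of the optics and k1 (around 0.25 at best) absorbs process tricks. The numbers below are typical published values for illustration.

```python
def min_feature_nm(wavelength_nm, na, k1=0.25):
    """Smallest printable half-pitch under the Rayleigh criterion."""
    return k1 * wavelength_nm / na

# 193 nm ArF immersion lithography, NA around 1.35:
print(f"{min_feature_nm(193, 1.35):.1f} nm")   # ≈ 35.7 nm
# 13.5 nm EUV, NA around 0.33:
print(f"{min_feature_nm(13.5, 0.33):.1f} nm")  # ≈ 10.2 nm
```

Note that modern "3 nm" and "2 nm" process names are marketing labels rather than physical feature sizes; actual printed features are considerably larger, which is consistent with the gap the paragraph above describes.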

How do they get around this? Instead of shining light through a mask, they scan the wafer with an electron beam. The problem is that this takes time, and if any vibration occurs meanwhile, the beam can smear across the wafer and produce a reject.

The modern process produces a monstrous number of rejects; yield can be as low as 5%. For example, when you buy a 4-core processor, the die may actually contain 16 cores, with 12 of them defective and disabled.
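The core-binning trick above can be sketched with a small simulation. The per-core defect probability here is invented purely for illustration; real defect statistics depend on die area and process maturity.

```python
import random

random.seed(42)  # reproducible run

def good_cores(total_cores=16, p_core_defect=0.4):
    """Number of working cores on one simulated die."""
    return sum(random.random() > p_core_defect for _ in range(total_cores))

# Simulate a batch of dies, each manufactured with 16 cores.
dies = [good_cores() for _ in range(10_000)]

# A die is sellable as a 4-core part if at least 4 cores survived;
# the rest are simply fused off. Binning rescues dies that would
# otherwise be total rejects.
sellable = sum(d >= 4 for d in dies) / len(dies)
print(f"dies sellable as 4-core parts: {sellable:.1%}")
```

Even with an (assumed) high per-core defect rate, most dies still yield four working cores, which is why selling partially disabled chips is economically attractive.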

The smallest feature sizes are achieved with an electron beam.

To cut down defects, manufacturers resort to all kinds of tricks: they put factories in deserts and swamps, reroute aircraft away from the plants, synchronize the production cycle with train schedules, and float lithography machines in pools of oil.

In any case, we are clearly approaching the limits of this technology. Even where further progress is possible, it stops being economically worthwhile. You have probably noticed that instead of raising processor frequency, vendors now put more processors side by side and build multi-core chips.

New principles and approaches are needed, and within them Moore's law is unlikely to hold. Development will proceed at a different speed, and since the entire economy of the planet will not be driving that particular process, there is no guarantee it will be exponential.

It seems to me that Moore's law, if not stopped, is very close to stopping. That is, we are somewhere on the S-curve and can speculate about what awaits us when Moore's law runs out.

What's next: Can we predict how IT will change

Of course, we are far from exhausting the possibilities of current technologies. Can't raise the processor frequency by shrinking the dielectric wall? Then place several cores side by side on the die. Cores too big to fit many? Simplify them (RISC) and specialize them, as video cards already do with their thousands of cores.

I think we will see desktop solutions with tens and hundreds of thousands of cores. So exponential growth will not hit the ceiling at once; most likely it will continue as a slow climb until we reach the plateau.

We can try to predict life after Moore's law by looking at engineering fields that took off the same way: steam, electricity, aircraft, space, construction. Most have reached a plateau and keep developing steadily; some peaked and even degraded (steam, partly space).

Whether we are reaching the plateau or still operating under Moore's law can be judged by a number of signs.

Closing the gap between hardware and software

Once Moore's law expires, a period of closing the gap between hardware and software still awaits us. If you write an ordinary calculator for Windows in assembler today, you can fit it into 14 KB (I tried). True, it will not be as pretty, the graphics will eat memory, and we are leaning on Windows libraries. But the stock calculator currently weighs 280-500 KB, which is 20-40 times more than it could, and the Windows calculator is a small, optimized program at that.

The craze for packages and package managers inflates all software to gigantic size. Any developer can supply similar examples: heaps of npm packages that duplicate each other's functions and drag in dependencies, or Ruby on Rails projects that feel cramped in 2 GB of memory even when all they serve is a landing page.

We have plenty of room to optimize software: to finish baking the layers of the cake that never had time to set while Moore's law was in force. The situation arose because developer time costs more than memory, and for a very long time we traded memory and CPU for development speed.

If you notice that your work requires more and more mathematics and algorithms, or you find yourself thinking about reducing rather than adding servers, this may be one of the signs that Moore's law is stopping.

Standardized projects

Looking at neighboring engineering disciplines, we are very likely headed for standardized projects, where you are told from school which databases to use and how to program, and handed ready-made libraries. Something like this exists already, but not in that form. Today at Netology we can say: "Right, we need MongoDB, let's try it. Didn't work out, we'll drop it." With standardized projects that will no longer be possible: almost at the legislative level, at the level of a ministry, it will be prescribed that the industry uses such-and-such a database and nothing else. That is, we should expect not just well-known algorithms and patterns, but outright certified databases, servers and software, without which a program cannot be sold.

Gender balance

Although Ada Lovelace is considered the first programmer, and the language Ada is named after her, until recently the IT industry was heavily skewed toward men.

The argument is that nature experiments on males. Deviations are believed to occur more often in the male genotype than in the female. Women must be protected, since they bear offspring, while the men go off to kill a mammoth: some will be trampled, some will kill each other along the way. Nature runs its experiments on the latter.

As a result, it has evolved that men are more prone to risk, while women are more systematic, careful and cautious. Even investment statistics show men favoring riskier instruments (stocks) and women safer ones (bonds, deposits).

That is, the male population is drawn to adventurous fields more than the female. Most likely this explains the small number of women in IT during the Moore's-law era, when acquired skills and knowledge were wiped out too quickly and the rules of the game changed too often.

Those who worked in the industry 20 years ago may remember that there were practically no women among programmers. Recently I noticed that I have started hiring women and men roughly 50-50. Analyzing that change is in part what prompted this article. I stubbornly suspect it is one of the signs of passing the inflection point: calm software design by well-defined rules suits careful, methodical women better than aggressive men.

Growth points

What happens if Moore's law does stop and we settle onto the S-curve? Life will not stop, the money will not vanish, and business will keep being inventive and looking for points of application. For example:

  • space;

  • thermonuclear energy;

  • biotechnologies;

  • quantum computers.

I am sure there are other growth points too, but either I personally do not see them, or so far only a few people in science suspect them.


Space

Space was long seen as a source of endless expansion for mankind, economic growth included. Gigantic budgets went into the space race and the best personnel were recruited, right up until it became clear that the Solar system is made of the same stuff as the Earth. There is no economic payoff in space flight.

We have already said that the driving force of engineering is the economy. People want to raise their standard of living, and on the idea of doing something purely for its own sake you will not last long. Interest fades, and people turn to what benefits and improves life here and now.

This is a big problem, and it is being solved almost artificially: an unprecedented constellation of satellites for distributing the Internet, plans to mine helium-3 (3He), from which we still do not really know how to extract energy, a niche carved out for space tourism. One gets the impression that anything will do, as long as the technologies acquired during the space race are not lost.

Potentially, space still holds hope as the source of another engineering explosion: by human standards its spaces and resources are almost infinite. But the "infinite" distances mean investors cannot use the returns within their lifetimes, which keeps most people out of the space economy. If in IT we speak of an S-curve leveling onto a plateau, in space there has even been some regression.

Thermonuclear energy

Thermonuclear energy could increase the amount of electricity generated and simultaneously cut its cost by orders of magnitude. That resembles the conditions under which Moore's law operated, and it could lift humanity to a new level of development.

The source of energy in atomic and nuclear power is the conversion of matter into energy according to Einstein's famous formula E = mc².

All else being equal, the energy you can extract is proportional to the mass lost by the source fuel. The speed of light is a very large quantity, so the destruction of even a tiny amount of matter releases an enormous amount of energy.
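The scale of E = mc² is easy to check numerically: converting even one gram of mass releases energy comparable to the daily output of a large power plant (the comparison figure is rough, for orientation only).

```python
C = 299_792_458  # speed of light in m/s (exact by definition)

def mass_to_energy_joules(mass_kg):
    """Energy released if `mass_kg` of mass disappears in a reaction."""
    return mass_kg * C ** 2

e = mass_to_energy_joules(0.001)   # one gram of mass
print(f"{e:.3e} J")                # ≈ 8.988e+13 J
print(f"{e / 3.6e9:.0f} MWh")      # converted at 1 MWh = 3.6e9 J
```

This is exactly why the tiny mass defect in fission and fusion reactions is enough to power whole cities.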

At the moment we burn the uranium isotope 235U in nuclear power plants. Its share in natural uranium is only 0.7%; the rest is 238U, which, for lack of better use, goes into tank armor and shell cores. There is not enough 235U on the market, which is one reason so few new nuclear plants are built: there is simply not enough uranium for them all. The situation is somewhat saved by plutonium, bred in reactors from 238U; it can serve not only for nuclear weapons but also as power-plant fuel.

The transition to fast-neutron reactors would allow using the entire mass of natural uranium, including what has accumulated over previous years. The project is called the "closed nuclear fuel cycle" and is currently pursued only in the Russian Federation; all other countries have frozen fast-reactor development, in no small part because of nuclear disasters and the difficulty of cleaning up after them. Fast-neutron reactors run at a much higher temperature, so ordinary water will not do for the primary cooling circuit. Until recently a sodium-potassium alloy was used as the coolant; given how violently it reacts with water, an accident at such a plant would likely end in a grand explosion. Hence the plan to use lead as the coolant instead.

So today we can extract heavy atoms and harvest energy from their decay, which proceeds with a loss of mass. But we are surrounded mostly by light atoms, and it is far more promising to run a nuclear reaction between two light nuclei and obtain heavier atoms. If such a reaction also proceeds with a loss of mass, we again get a great deal of energy. This process has been running for billions of years in stars, including the Sun, where hydrogen fuses into helium.

In theory, fusion can continue up to the formation of iron, the most stable element in our universe. At high energies still heavier nuclei can form, in reactions that absorb excess energy. That is how the matter of our planet was formed, including the small amount of uranium we now mine and burn in power plants; it is thought to have been produced in a supernova explosion and later captured by the Sun.

Nuclear fusion reactions run in nature all the time. All that remains is to reproduce them while extracting benefit for people. Why haven't we? The problem is the monstrous temperatures involved: no material in earthly conditions can withstand them, so the plasma where fusion occurs has to be confined by electromagnetic fields, and maintaining those fields consumes electricity. So far we spend more energy than we extract, and to benefit it must be the other way around. We have been trying for 70 years. If it works out, civilization will jump to a new level of development; it might even get the space program moving again.


Biotechnologies

The potential of this area is clear to almost everyone, because we ourselves are a product of biotechnology. Human-made technologies are tens of thousands of years old; biotechnologies are billions. By the standards of modern science they are honed to perfection, since they had the time for it.

The efficiency of biological processes reaches 80-100%, while our internal combustion engines manage only about 40%. We are still very far from reactions with 100% selectivity: we can use enzymatic catalysis, but we cannot yet build catalysts of that class ourselves.

Practically every science fiction writer has described what a world with biotechnology might look like: Stanisław Lem, Harry Harrison, William Gibson, Sergey Lukyanenko; the list goes on indefinitely. Computer games raise the topic constantly: Fallout, Deus Ex, Cyberpunk 2077, and the full list there is impressive too. Almost no one doubts the enormous potential of technologies honed over spans of time humanity has never had. If we succeed here, an analogue of Moore's law could run far longer, and humanity would reach a new standard of living, a new economy, new professions and entertainments.

Quantum computers

There has been a lot of talk about quantum computers lately. In essence, a quantum computer is an analog machine that lets you simulate quantum processes. Cryptography is often mentioned in this context, since quantum algorithms can reduce the cryptographic strength of existing encryption schemes, but quantum computers were not created for those tasks at all.

To get a feel for them, consider an analogy. Astronomy has the three-body problem. We can express the gravitational interaction of two massive bodies analytically, in formulas, and predict their trajectories exactly. As soon as a third body is added to the system, the problem no longer has an analytical solution and has to be solved numerically.
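To make the analogy concrete, here is a minimal sketch of what "solving numerically" means for three bodies: there is no formula to plug time into, so we advance the system in small steps. The masses, units and initial conditions below are arbitrary illustrative values (not from the article), and the integrator is a simple semi-implicit Euler step.

```python
# Three-body problem: no closed-form solution exists in general,
# so positions are obtained only by stepping through time.

def accelerations(positions, masses, G=1.0):
    """Newtonian gravitational acceleration on each body (2-D)."""
    acc = []
    for i, (xi, yi) in enumerate(positions):
        ax = ay = 0.0
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            dx, dy = xj - xi, yj - yi
            r3 = (dx * dx + dy * dy) ** 1.5
            ax += G * masses[j] * dx / r3
            ay += G * masses[j] * dy / r3
        acc.append((ax, ay))
    return acc

# Two heavy bodies on a mutual orbit plus one light body circling the pair.
masses = [1.0, 1.0, 0.001]
pos = [(-1.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
vel = [(0.0, -0.5), (0.0, 0.5), (1.0, 0.0)]

dt = 0.001
for _ in range(10_000):  # 10 time units of simulated motion
    acc = accelerations(pos, masses)
    vel = [(vx + ax * dt, vy + ay * dt)
           for (vx, vy), (ax, ay) in zip(vel, acc)]
    pos = [(x + vx * dt, y + vy * dt)
           for (x, y), (vx, vy) in zip(pos, vel)]

print(pos[2])  # position of the light body, known only step by step
```

For two bodies the same trajectory could be written down as an exact formula; adding the third body is what forces this step-by-step approach.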

In quantum mechanics we can solve problems analytically only for systems with a single particle. With several particles, the Schrödinger equation can be solved only numerically and approximately. This is not a problem while there are few particles: for example, when calculating the first three periods of the periodic table, ignoring relativistic effects and the fact that atomic nuclei behave like quantum objects. The accuracy suffers, but the results more or less agree with experiment. As we move down the periodic table, the atoms grow a thick "fur coat" of electrons, and our approximate models fall apart. In a complex molecule there are not just many electrons, they are also shared between atoms; the same story holds in a metal. And in reality we deal not with one or two atoms but with about 10²³ of them.

Modeling systems with many quantum particles becomes an impossible task. We do not have a suitable model or the mathematics for it; we have been looking for them for about 100 years and cannot make a fundamental breakthrough. This problem needs to be solved, among other things, for a breakthrough in quantum electrodynamics and the design of subnanometer-scale microcircuits.
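Why "impossible" is barely an exaggeration can be seen with back-of-the-envelope arithmetic. A standard way to state it (my simplification, not the article's): the full quantum state of n two-level particles requires 2^n complex amplitudes, so the memory needed to store it exactly doubles with every particle added.

```python
# Exponential cost of storing an exact quantum state classically.
BYTES_PER_AMPLITUDE = 16  # one complex number in double precision

def state_vector_bytes(n):
    """Memory needed for the full state of n two-level particles."""
    return (2 ** n) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50, 80):
    print(f"{n:3d} particles -> {state_vector_bytes(n) / 2**30:.3e} GiB")
```

Thirty particles already demand 16 GiB, fifty demand petabytes, and eighty exceed any conceivable storage, let alone the ~10²³ atoms of a real material.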

The creation of quantum computers is an attempt to attack the problem in an essentially analog way. If we obtain special functions from quantum mechanics that we can compute as easily as a sine or cosine, we will advance in understanding and accurately modeling the quantum world. Today we casually compute the sine of an angle on a calculator or phone without thinking about what stands behind it, and dozens of dissertations stand behind it.

For a long time I refused to consider quantum computers to be computers: to me they were physical instruments, like an NMR spectrometer. But a quantum computer is a programmable machine, and that makes it a computer. Its potential seems to me as huge as the mass adoption of electricity or the first integrated circuit.

The first Intel processor was created for a calculator. No one back then foresaw the huge mass market of personal computers, or what it would grow into in the 21st century. Similarly, we do not yet know what a higher speed of modeling quantum systems will give us: new catalysts, medicines, implants, a new economy, new entertainment and professions. But it is certainly a new standard of living for everyone on Earth, a new leap in the scientific and technological revolution. A new explosion.


Any of the potential growth points described above can become new fuel for exponential growth, similar to IT's. Money and personnel will flow there from IT, by the way. That is why it is important to understand where you stand and what the dynamics of your industry look like ahead.

High salaries in IT, pleasant educated colleagues and the attention of the whole world are not an endless Klondike. It so happens that we are now near the point of maximum speed of the "computer explosion". The fuel of this explosion has been burning since the middle of the last century and, with fairly high probability, ran out around 2010. Give or take another 30-60 years, the IT sphere will keep growing and developing, then reach a plateau and become an ordinary engineering industry.

Those studying to become IT specialists still have time to establish themselves in the profession. But there is no longer any point in running away from mathematics: there will be more and more of it in the industry. In any case, even more amazing and powerful engineering tasks and industries are right next door. When will these sleeping volcanoes wake up? Nobody knows.
