Archive by Author | rezananoha

The next generation of computers: Bio computers and Quantum computers

Moore’s law will break down
In 1965 Intel’s co-founder Gordon Moore made a prediction about the power of computer chips: that the number of transistors on a chip, and with it computing power, would double roughly every 18 months. This prediction became known as Moore’s law, and it has inspired Intel’s innovations ever since. To this day, Moore’s law has held up. However, scientists believe that maintaining it will become quite difficult, and predict that the pace at which computing power grows will gradually decline by the end of this decade. Over the years chip producers have crammed more and more transistors onto a single chip, shrinking the production process to a microscopic scale. But the ever smaller production process brings problems. One major problem with current silicon technology is heat generation, and it only gets worse as the process shrinks further.
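To make the doubling concrete, here is a back-of-the-envelope sketch in Python. It uses the Intel 4004's well-known 1971 transistor count (about 2,300) as a starting point; the projection is purely illustrative extrapolation, not a claim about any real chip.

```python
# Illustrative Moore's-law projection: transistor counts doubling
# every 18 months, starting from the Intel 4004 (~2,300 transistors, 1971).

def transistors(years_elapsed, start=2300, doubling_period_years=1.5):
    """Project the transistor count after a given number of years."""
    return start * 2 ** (years_elapsed / doubling_period_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistors(year - 1971):,.0f} transistors")
```

Each decade adds 10 / 1.5 ≈ 6.7 doublings, i.e. roughly a hundredfold increase, which is why the curve runs into physical limits so quickly.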


Alternatives to silicon chips
At the moment Intel and its competitors are keeping Moore’s law alive by using multicore chips to achieve parallel computing. But even Intel has admitted that silicon technology will eventually come to an end. A number of alternative computer technologies have therefore been proposed: optical computers, bio computers, and even molecular computers. Thanks to massive parallelism, a bio computer would need only a few hours to analyze a huge amount of information, far surpassing the centuries the same analysis would take on our conventional computers.


The power of quantum computing
In the long run there is a good possibility that we will see quantum computers. Quantum computers can truly be seen as the ultimate computers. They would probably mark a leap forward in computing power far greater than the transition from the very first calculator to a modern-day supercomputer. The potential power of quantum computers is mind-boggling. A quantum computer is essentially a massively parallel processing machine: it can work on millions of calculations simultaneously, whereas a conventional computer works on one calculation at a time, in sequence. However, don’t expect to see a quantum computer anytime soon.
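A rough intuition for where that parallelism comes from: a register of n qubits is described by amplitudes over 2^n basis states, so each added qubit doubles the state space a single quantum operation acts on, while a classical n-bit register holds exactly one of those states at a time. The sketch below only counts states; it is not a simulation.

```python
# Illustrative count of the state space an n-qubit register spans.
# A classical n-bit register stores one of these states at a time;
# a quantum register carries amplitudes over all 2**n of them.

def quantum_basis_states(n_qubits: int) -> int:
    """Number of basis states an n-qubit register spans (2^n)."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(f"{n} qubit(s): {quantum_basis_states(n):,} basis states")
```

Already at 50 qubits the state space exceeds a quadrillion basis states, which hints at why even a modest quantum machine could outpace conventional hardware on the right problems.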

The next generation Internet Protocol: IPv6, the enabler of a larger internet


The last days of IPv4
When the internet was originally designed in the 1970s, its founders thought it highly unlikely that the IP address space would ever become an issue, because the current addressing scheme (known as IPv4) can allocate almost 4.3 billion IP addresses, which seemed to be more than enough at the time. But despite this impressive number, they could not have foreseen the massive growth of the internet in the late 90s. During the last decade the number of internet users has grown even further.


To anticipate the exhaustion of IPv4 address space, the IPv6 scheme was introduced in 1996 by the Internet Engineering Task Force (IETF). IPv6 can hold an impressive 340,282,366,920,938,463,463,374,607,431,768,211,456 (2^128) unique IP addresses. However, IPv4 is still operational and still carries the internet today. In fact, the last blocks of IPv4 address space were allocated in February 2011.
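The size gap between the two schemes follows directly from their address widths (32 bits for IPv4, 128 bits for IPv6), as this small Python check shows:

```python
# Address-space sizes computed from the protocol definitions:
# IPv4 uses 32-bit addresses, IPv6 uses 128-bit addresses.

ipv4_total = 2 ** 32    # ~4.3 billion addresses
ipv6_total = 2 ** 128   # ~3.4 * 10**38 addresses

print(f"IPv4: {ipv4_total:,}")
print(f"IPv6: {ipv6_total:,}")
print(f"IPv6 offers 2**96 (~{ipv6_total // ipv4_total:.1e}) times more addresses")
```

In other words, IPv6 does not just add headroom; it multiplies the address space by a factor of 2^96, enough to give every device imaginable its own globally unique address.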

Transition from IPv4 to IPv6
This year an actual transition has been set in motion to migrate the internet protocol from IPv4 to IPv6. The problem is that over the years the internet has become quite complex, and coordinating such a migration is challenging, as it involves governments, enterprises, manufacturers, internet service providers (ISPs) and even individuals. It will require a collaborative effort from many parties. To encourage this, an initiative launched on June 6th marked the beginning of the internet’s official transition to IPv6. The actual transition, however, will probably take several years, because many governments and enterprises are still hesitant to invest in making their network infrastructure IPv6 compatible. China and other connected Asian countries have invested heavily in IPv6 deployment, and European companies transacting with Chinese businesses in that region will probably need to be at the front of the IPv6 transition.

The future is IPv6
At any rate, the transition from IPv4 to IPv6 is essential for the growth of the internet. If the transition does not happen in the near future, newly produced internet devices will not be able to connect to the internet because of the lack of available IPv4 addresses. On the other hand, once the transition is complete, workarounds like NAT (Network Address Translation) would become obsolete. Also, smartphones, digital cameras, cars, refrigerators, microwaves, TVs, and many more devices will be able to communicate seamlessly with each other in the future.

The next generation of the World Wide Web is near: Web 3.0 and the Semantic Web

Tim Berners-Lee, inventor and founding father of the World Wide Web, has described the Semantic Web as part of Web 3.0. Several years ago, Web 2.0 revolutionized how we use the web. Technologies like AJAX and XML have contributed to the rich and interactive user experience we enjoy today on Facebook, Twitter, blogs, Google and the like. Web 3.0 is going to unleash a revolution that is at least equally amazing. The Semantic Web in particular is a promising development, and worth keeping an eye on.

Semantic Web: Data with meaning
The internet consists of an impressive amount of data, roughly 5 million terabytes by some estimates, and is growing rapidly each day. The largest portion of it is unstructured data, and computers don’t understand the meaning of this unstructured data. That is where the challenge lies for the Semantic Web. The idea behind Semantic Web technologies like RDF and OWL is to give this unstructured data meaning, so that a computer can interpret and analyze data much as humans do. For instance, a computer could understand the sentence “Amsterdam is the capital city of the Netherlands”.
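The core idea can be sketched with RDF's data model: facts stored as subject-predicate-object triples that software can query by meaning rather than by keyword matching. The tiny in-memory store below is a hand-rolled illustration, not a real RDF library, and the predicate names are made up for the example.

```python
# A minimal sketch of the Semantic Web idea: facts as
# subject-predicate-object triples (the data model behind RDF).

triples = {
    ("Amsterdam", "isCapitalOf", "Netherlands"),
    ("Paris", "isCapitalOf", "France"),
    ("Netherlands", "isA", "Country"),
}

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# "What is the capital city of the Netherlands?"
matches = query(predicate="isCapitalOf", obj="Netherlands")
print(matches[0][0])  # Amsterdam
```

Because the relationship is stated explicitly rather than buried in prose, the answer falls out of a simple pattern match; real systems express the same idea with RDF triples and SPARQL queries at web scale.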

What are the benefits over the current Web 2.0?
The roots of Semantic Web technology lie in the field of artificial intelligence. One of its benefits could be intelligent web applications (e.g. intelligent agents or intelligent search engines). Such applications can collect information from many different sources, combine it, and present it to users in a meaningful way. A search query like “What is the capital city of the Netherlands?” would then simply return the answer “Amsterdam”. The accuracy of search results would be drastically improved compared to the results we get today.

So, when can we expect this technology to emerge?
RDF and the other technologies needed to implement the Semantic Web have been around for several years now. However ambitious the vision may be, creating a complete Semantic Web is not as easy as it seems. It will probably take considerable time and collaborative effort to enrich the bulk of the internet’s unstructured data with semantic annotations.