
Access to tools for the beginning of infinity

Comparing traditional computing versus the cloud

It may not be the personal computing giants Microsoft and Apple who change the industry, but its users. The future of the Internet and computing looks much more open, affordable and efficient.

Information is increasingly created and shared online, and the computer is becoming just a means of access to Web 2.0. Or at least, the leisure and productivity tools associated with Web 2.0 are migrating to the web.

This article was written in a free note-taking application, then moved to Google Docs and finally published on faircompanies, a website where users don’t just consult information but comment on it, recommend it and publish their own content.

I also use Zoho Office Suite, a free product that is accessed rather than downloaded and that can even work offline: Zoho has integrated Google Gears into its package of online programs. Gears is designed to keep Internet applications working when there is no connection; when the computer recovers its signal, it uploads the locally stored information to the online version.

Google Gears has an Application Programming Interface (API) that anyone can use to keep an online service working offline, even companies that, like Zoho, compete in the same market as the tool’s creator. To reach this point of collaboration, the philosophy of these companies had to change.
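The pattern Gears implements can be sketched in a few lines: buffer changes locally while the connection is down, then replay them to the online service once it returns. The sketch below is illustrative Python, not the actual Gears API (which was JavaScript); every class and method name here is invented:

```python
# Sketch of the offline-first pattern described above: while the
# connection is down, edits accumulate in a local queue; when the
# connection returns, queued edits are replayed to the server.
# All names are illustrative -- this is not the Gears API.

class OfflineCapableClient:
    def __init__(self):
        self.online = False
        self.local_queue = []   # edits stored locally while offline
        self.server_state = []  # stand-in for the online copy

    def save(self, edit):
        if self.online:
            self.server_state.append(edit)   # direct write when connected
        else:
            self.local_queue.append(edit)    # buffer locally when not

    def reconnect(self):
        """Called when the computer recovers its signal: sync the backlog."""
        self.online = True
        while self.local_queue:
            self.server_state.append(self.local_queue.pop(0))

client = OfflineCapableClient()
client.save("paragraph 1")      # written while offline
client.save("paragraph 2")
client.reconnect()              # backlog uploaded in order
print(client.server_state)      # ['paragraph 1', 'paragraph 2']
```

Any online suite can layer this pattern over its existing save path, which is why a competitor like Zoho could adopt Gears without exposing its internals to Google.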

The blog that I update weekly and the photos that I share with family, friends and colleagues are products that only live on the Internet. This situation, rather than creating more anxiety, instills security in me. None of these applications are installed on my computer.

Being able to access them at any moment and from anywhere, with a cellphone, a laptop, or any computer or device with Internet access, is a big step up from all those documents we dragged along in our digital pockets not so long ago.

With Web 2.0, information can be created, consumed and distributed on the Internet, and in most cases without paying a dollar to do so. No one should worry too much about the philosophical or authentic meaning of “Web 2.0” (for Tim Berners-Lee, for example, nobody really knows what it means).

Still, we all need a device to access these services, plus a browser. And Web 2.0 (meaning the services we use daily that store our information online instead of keeping a copy on our desktop) has not yet reached personal computing, so to speak.

The era of ubiquitous access to content

The expression Web 2.0 has been used to refer to services in which the user not only consumes and interacts, but also produces and distributes their own content.

The border between professional content and services and those created by amateurs is increasingly blurred, thanks to the Internet and its open, contribution-oriented architecture.

Videos, documents, photographs, technical products, music, tastes and preferences exchanged through social networks, encyclopedic content.

Some blogs specialize in disseminating real-time information with notable quality. TechCrunch, the blog specializing in technology and Web 2.0, has surpassed the audience of CNet Networks since December 2007, according to comScore.

CNet Networks is the information technology network that has led this sector practically since the Internet became a global phenomenon. That a blog like TechCrunch, until recently sustained by little more than the writing of its founder, Michael Arrington, a Silicon Valley insider who speaks to and for an audience highly specialized in new Web 2.0 businesses, can draw more visits than CNet, a Nasdaq-listed corporation with (as of January 2008) 2,600 employees, is surprising, to say the least.

Blogs, videos, photos, social networks

Some investors don’t like this role reversal in technology news, where a familiar tone and a few quality sources, even with content produced from home, attract more readers. According to The New York Times, investors want to buy the business to take control of the situation.

More content is created and distributed than ever. Much of it is dismissed as “noise” or trash, even as a considerable share of these blogs, videos and photographs now competes in quality with professional content.

There are ever more users consuming, sharing, creating and publishing content on the Internet, thanks to the expansion of bandwidth and wireless Internet access. With the total volume of stored information, and the traffic tied to creating and consulting it, both growing, where will all this content be stored?

Businesses like Google, the biggest advocate of the new generation of Web 2.0 services by virtue of its size and worldwide market share, have the answer: ever more powerful data centers.

Who pays for all this?

Gartner explains that these data farms, composed of hundreds or thousands of networked servers, now account for a quarter of the carbon dioxide emissions of the entire information technology industry.

The energy needed to keep servers running and to cool data centers is the fastest-growing source of emissions in the technology industry.

A study from the University of California at Berkeley that measured server energy use in the United States found that, in 2005, American servers consumed 45 billion kilowatt-hours, more than the amount used annually by 19 American states.

A study by Forrester Research describes the environmental footprint of personal computing and its associated services: by the end of 2008 there will be more than 1 billion personal computers in use worldwide, and 2 billion by 2015. Brazil, Russia, India and China alone will account for 775 million computers by 2015.

Servers, data processing centers, personal computers and cloud computing

If the so-called Web 2.0 and its associated content are not to contribute ever more to climate change, the Gartner and Forrester data suggest the Internet has to evolve in more areas than service innovation alone:

  • Data centers should become much more efficient, even self-sufficient, through the use of renewable energy and installation in temperate climates that reduce the need for air conditioning. To improve the energy efficiency of servers and the centers that house them, the Climate Savers Computing Initiative, created in 2007 by several businesses (Google and Intel among them), aims to bring together companies, consumers and NGOs concerned about technology’s environmental impact. Google says it has improved the energy efficiency of its servers with purpose-designed power connections that generate less heat and waste less energy. Another initiative, The Green Grid, an association of technology companies, data center operators and consumers, aims to improve the energy efficiency of American data centers by 10%. It doesn’t sound like a spectacular advance. A critical reading suggests these businesses are trying to regulate themselves to preempt laws and future regulations.
  • The computers themselves, which by 2015 will be present in many homes in emerging countries, have to deliver the educational and technological experience promised, with the smallest possible energy consumption. Personal computers waste 50% of the energy they consume, according to Urs Hölzle of Google speaking to News.com, losing it as heat. Paradoxically, this heat is not just wasted energy: it warms the processor and other components, requiring cooling fans that consume still more energy. A vicious circle that shows how little the power connections and transformers of electronic and computing equipment have evolved in recent decades.

Cloud computing against energy waste

It’s called cloud computing, and it improves the energy efficiency of websites, data centers and Internet infrastructure in general.

Until this technique became popular, hosting a product or remote service required a computer or server, or several at once, connected to the Internet. That infrastructure always consumes the same energy and bandwidth, because it is provisioned in advance.

Often, consumption exceeds what the actual traffic requires: under this model, capacity is deliberately over-provisioned so the site doesn’t fail during spikes in connections.

Cloud computing uses an infrastructure composed of tens, hundreds or thousands of computers connected to one another and, in turn, to the Internet. An Internet service or product becomes an “object” within the overall infrastructure.

A service of this type has exactly as much space and bandwidth as it needs at any given moment. The size of the infrastructure and the flexibility of its design not only improve energy efficiency but dramatically lower the price of storage and connectivity. An ideal scenario for economies of scale.
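The economics can be illustrated with a toy calculation (all figures below are invented for illustration; they are not Amazon’s or anyone else’s real prices): a fixed server must be sized for its traffic peak and paid for around the clock, while elastic infrastructure bills only the capacity each hour actually consumes.

```python
# Toy comparison of fixed provisioning vs. elastic (cloud) billing.
# Prices and traffic figures are invented for illustration only.

hourly_traffic = [10, 12, 8, 9, 11, 95, 14, 10]  # requests/s, one spike
capacity_unit_cost = 0.10                        # cost per capacity unit per hour

# Fixed model: provision for the peak, pay for it every hour.
peak = max(hourly_traffic)
fixed_cost = peak * capacity_unit_cost * len(hourly_traffic)

# Elastic model: pay each hour only for the capacity actually used.
elastic_cost = sum(t * capacity_unit_cost for t in hourly_traffic)

print(f"fixed:   {fixed_cost:.2f}")    # 76.00 -- sized for the spike
print(f"elastic: {elastic_cost:.2f}")  # 16.90 -- follows demand
```

The gap widens as traffic gets spikier; for a perfectly flat load the two models cost the same, which is why economies of scale favor pooling many variable workloads on shared infrastructure.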

Personal data processing: Is there life beyond Microsoft and Apple?

Is there anything like cloud computing in personal computing? For now, no. To draw an analogy between the PC and the auto industry: many of us use equipment that wasn’t designed around our daily needs, but around a traditional computing model founded by Apple and brought to the masses by Microsoft: an operating system in which tasks are subordinated to the purchase of software products.

We drive a car that is, in some cases, large and ugly; in others, an attractive sports car whose oil we cannot change without asking the brand owner’s permission and making a trip to the cash register. The metaphor, and who is who in it, should be clear.

The copyright of these programs is controlled by the company that sells them, which merely extends to the user the right to use the product. A document in the Office 2007 format cannot be opened by previous versions of the software (Tim O’Reilly, among others, predicts the end of the culture of launching closed software versions that don’t evolve until the next release), just as an iPod cannot be used with any music download application other than iTunes.

Microsoft decides at every moment how users see, save and interact with their documents. Apple does the same with its services and products.

This model, proprietary, closed and evolving only through supposedly “critical” or “necessary” updates, continues to dominate personal computing overwhelmingly: Microsoft Windows, Apple Mac OS X and Microsoft Office are the main examples.

These software products follow a traditional commercial model, based on control of the market and on imposing updates for which the user must pay, directly or indirectly.

Apple is now asking the record industry to eliminate its systems of electronic copyright protection, or DRM, arguing that this will expand the legal purchase of music online. Yet while Steve Jobs fights for song downloads free of copy protection, he keeps his own software under a license that prevents third parties, whether businesses or individuals, from improving, customizing or adapting it.


Alternatives? That depends. Patience is required; it’s not as easy and comfortable as buying a computer and knowing its operating system in advance. One must investigate and dedicate time, though no more than many of us spend when we decide to explore Mac OS X.

Before the purchase of a new computer, the question in the air seems to be whether to choose one with Windows Vista or to reinstall Windows XP, while the radical change is to move beyond these versions to Mac OS X Leopard.

Many users fear that their camera, printer or music player will stop working if they don’t use Microsoft Windows Vista or Mac OS X Leopard. But there are several viable alternatives on this market, all based on an open source architecture: Linux.

Most of them include the support needed to avoid problems and incompatibilities with peripherals from the main brands. Granted, adopting Linux is not as comfortable as sticking with the two predominant options: the overwhelming majority’s choice (Microsoft Windows) and its fashionably designed alternative (Mac OS X).

It’s a different story in Internet services and infrastructure, where open source software is gaining ground and businesses, above all providers of Web 2.0 services, don’t think twice: Linux, in its numerous versions, is the fastest-growing platform, to the detriment of systems like the Microsoft .NET Framework.

All the “killer applications”, those truly successful programs whose disruptive technology can transform or create a market, were created on and run on Linux or BSD Unix. Among them, the applications and services of Google, Amazon, and Yahoo! Maps.

Cloud computing favors this tendency, since businesses like Google and Amazon (the Seattle-based online store is the first company to market cloud computing services to the general public, through its product Amazon Elastic Compute Cloud) rely on flexible cloud infrastructures.

Google and company: innovation itself moves to the Web

Google, Amazon, IBM and Yahoo! have transformed their data centers into cloud computing farms, where an application or website is an element occupying just the space it needs to run at full performance: when traffic is low, the owner of the hosted object (whether an application or a site) pays less for this hosting than for his Internet connection.

The price remains competitive even when popularity arrives by surprise and much greater traffic must be handled at peak moments. Phrases like “server down” or “updating the server” could become history if the new model, for now marketed only by Amazon, becomes universal.

BusinessWeek’s article “The Wisdom of Clouds” explains that the economical, efficient storage of this new model will eventually be offered to the general public by Google and, in time, by various competitors.

Google’s cloud is not as distant and ethereal an idea as its vague name implies. It is, according to the BusinessWeek article, “a network made of hundreds of thousands, or by some estimates 1 million, cheap servers, each not much more powerful than the PCs we have in our homes.”

It is Google’s already existing infrastructure. What the Mountain View company, like Amazon, is doing is offering the surplus of its extraordinary infrastructure so that users and businesses can store data there.

An internal Google document cited by Erick Schonfeld of TechCrunch claims that the business processes 20,000 terabytes (20 petabytes) of data daily, an extraordinary quantity of information that no other firm or institution can claim to generate.

That capacity is needed to index the Internet, process search results and maintain and serve advertising, as well as to run services like Gmail, Picasa photo albums, Google Apps, Google Docs, Google Analytics, Google Reader and a growing list of others.

Yahoo! (the owner of Flickr), the social network Facebook and hundreds of other businesses invest huge resources in improving the efficiency of their data centers to reduce the energy bill. And, in the process, emissions are avoided.

The latest Pew Internet study reinforces these businesses’ strategy: they are going to need infrastructure to store all the information being generated. The study finds that 59% of American adolescents create content on the Internet:

  • 39% of adolescents share their creations online (photos, videos, texts, etc.).
  • 33% create or work on web pages, blogs or social networking pages.
  • 28% have created their own blog.
  • 27% maintain their own personal web page.
  • 26% use Internet content in their own creations. On many occasions, legally (thanks to initiatives like Creative Commons, supported by websites and software firms).

When the Internet is the point of access and storage for our content

The article in which Tim O’Reilly defines this new wave of services, which he baptized Web 2.0, still applies, even though it was last updated in September 2005. It defines the Web as a platform on which we both consume and generate information.

Millions of professionals who work daily with the Internet, are comfortable with new products and technologies and have a certain technical level, could be performing their work with an XO, better known as the “100 dollar laptop”, created by One Laptop Per Child (OLPC).

This computer, whose real price is 200 dollars, has a wireless connection, an Internet browser compatible with any online service or application, an RSS reader, and basic text and image editors, along with other applications created from the point of view of the user rather than the application.

When content, services and applications move from our desktop to the Internet, our personal computer is no more than the means of access to the information we create, consult or share. The computer’s power becomes dispensable, as long as the Internet connection is good.

The user controls the contents

As Tim O’Reilly explained in 2005, well before the arrival of Google Docs, Zoho Office Suite and other services that let us work and collaborate online, in the new Internet services paradigm the user controls his or her own information, which sits at the center of the diagram.

O’Reilly also speaks of services instead of software packages, an architecture of participation, cost-effective scalability, sources of information that can be remixed and transformed, software above the level of a single device (ubiquity: we access the services from a laptop, a cellphone or a device like the Nokia N800 Internet Tablet), and harnessing collective intelligence (embodied in tag clouds, as on faircompanies).

In an interview in the January 2008 edition of Wired, the technology journalist Nicholas Carr, former executive editor of Harvard Business Review, spoke about his latest book, The Big Switch, and what he calls “World Wide Computing”.

Carr dissects the phenomenon of the server farms and networks that keep “data clouds” in constant operation. For Carr, this evolution is not entirely positive if it is controlled essentially by Google and Microsoft.

Against the threat of an oligopoly, with information controlled by a handful of businesses, the theory of The Long Tail (an article and later a book of the same title by Chris Anderson, editor of Wired) explains how minority services gain ground on the Internet, not just majority tastes (as with traditional media).

The Long Tail is about minority consumption, and the use and production of minority content. Given more options (the nearly unlimited choice online versus the limited space in bookstores and cinemas and on radio or television), users will abandon services that don’t give them the results they want, even products from Google or Microsoft.

When users of Digg, for example, disagree with some action by the company, some take measures to show their displeasure; others simply switch services.

In a participatory architecture, the user has space and a voice (at times, a vote) to complain. And to leave for a competitor’s service. Users comfortable with new online applications are ready to swap one free service for another if the newcomer is more efficient or attractive.

Carr, in the interview with Wired:

“People say they’re nervous about storing personal info online, but they do it all the time, sacrificing privacy to save time and money. Companies are no different. The two most popular Web-based business applications right now are for managing payroll and customer accounts — some of the most sensitive information companies have.”

The inertia of personal computing

The news in personal computing in January 2008 is the MacBook Air, an ultrathin laptop (0.16 inches at its thinnest point, 0.76 inches at its thickest) in Apple’s more accessible range, aimed at the general public and the education sector.

The main novelties are its industrial design (a house signature where Apple is concerned), a backlit LED display and a more powerful wireless connection than previous models.

As an improvement on existing models, Apple highlights in the technical specifications the product’s environmental profile: a recyclable aluminum enclosure, an arsenic-free LCD screen, PVC-free internal wiring and recyclable, low-volume packaging, besides its low power consumption.

Months after Steve Jobs published the document “A Greener Apple”, this laptop is the brand’s first product with significant advances in environmental impact.

With the purchase of a MacBook Air, the user gets by default the Apple operating system, Mac OS X Leopard, whose latest updates fix the scores of problems some users reported in the first weeks after its launch.

It is also possible to run Windows XP or Vista on Macintosh machines. Likewise, a PC can run Microsoft Windows or any other alternative, including the various versions of Linux, such as Ubuntu.

Windows Vista

Microsoft has said it will release the first big update for Windows Vista (known as SP1, or “Service Pack 1”) in the first quarter of 2008; Mary Jo Foley, of ZDNet.com, points to February 2008 as the most likely date.

Various experts have repeated in recent months that Windows Vista would not gain wide acceptance among large businesses and advanced users, since both corporations and hardcore users would wait for the first major update before deciding whether to adopt it.

The reasons: until this first update (SP1), errors and incompatibilities were frequent, and not even Microsoft would recommend installing the new system on machines holding critical information. Avoiding lawsuits is a strong motive for caution in its recommendations to corporate users.

Microsoft Windows Vista has received decent reviews in recent months (CNet’s among them, with a score of 7.8 out of 10), but also very harsh critiques and even a defection heard throughout the specialized, somewhat insular technology press (Robert Scoble, a veteran journalist and blogger on the Windows platform and an ex-Microsoft employee, decided to buy a Mac for his new computer instead of opting for Vista).

Beyond the graphical improvements and the thousands of usability details, some criticized by users, the system’s benefits appear limited. Published tests show that XP is faster than Vista on equal hardware. Other knowledgeable voices argue that there are few improvements over the previous version of the system.

Microsoft Windows Vista requires greater computing power (a faster processor and graphics card, mostly to enable the new system’s Aero graphics), which translates into greater energy expense.

According to InformationWeek, the migration from XP to Vista has created an opportunity in the operating systems market, not only for Apple but also for those who dare to sell machines with versions of Linux.

A paragraph from the article, written by Serdar Yegulalp and Mitch Wagner:

“If you’re one of those Windows users who are less than enchanted by what you’ve seen of Vista and you’re thinking about switching, you face some tough choices that can make you feel like a pioneer. Is it a good idea to move to a Mac, with its easy interface, high level of safety and stability — and higher prices? Or is it better to adopt a Linux distro, which is free (or, at least, inexpensive), supported by a range of imaginative developers — and not quite newbie-friendly? Either decision forces you into new, unfamiliar territory.”

  • A petition to New York Times tech journalist David Pogue, published on his blog, about the efforts of a group of Windows users who prefer to stay with Windows XP rather than migrate to Windows Vista, and who are fighting to prevent Microsoft from killing XP in June 2008 as planned (Microsoft will stop supporting XP and thereby oblige businesses and private users to adopt the new operating system).

Mac OS X Leopard

Apple, the renegade of personal computing for the last three decades, continues to cultivate its halo of rebelliousness, self-confidence and corporate creativity, San Francisco style. The eyeglass frames of Steve Jobs, its charismatic CEO (his chosen model here), as well as his famous presentation attire (black turtleneck and faded jeans), reinforce this image.

Perhaps because of this, Apple is more resistant to criticism (American journalist John C. Dvorak seems one of the few voices truly critical of the brand), and its image has remained largely free of debate over the quality of its products.

The iPod and the iPhone currently account for more than half of Apple’s business, and their importance will grow, according to analysts. Fittingly, in 2007 Apple Computer became simply Apple. The word “Computer” no longer made much sense.

As in the movies, where the noble, attractive or intelligent characters always use a Mac while the villains use hulking PCs, this more flattering image of Apple exists in the real world too.

Despite the strong brand, the launch of its latest operating system was, for many, a disaster. The negative opinions weren’t as numerous as those for Vista, but they exist: Mac OS X Leopard treated its first users (known in tech slang as early adopters) with little tact (here and here).

Compared with previous versions of Mac OS X, Leopard is not groundbreaking. Its architecture remains closed. Its price: 129 dollars for one license, 199 dollars for five.


The price a user will pay, directly or indirectly, to acquire Windows Vista or Mac OS X Leopard exceeds that of some basic computers sold with Linux as the operating system. For anyone who uses applications and services hosted on the Internet, these increasingly popular computers work as well as any other machine.

Price, though, is only one of the reasons users critical of the computer industry should consider adopting a version of Linux.

From the energy perspective, a study by the British government argues that adopting Linux delays the obsolescence of computers, a move that could reduce electronic waste by millions of tons per year.

The report cites “a typical hardware refresh period for Microsoft Windows systems as 3-4 years; a major UK manufacturing organisation quotes its hardware refresh period for Linux systems as 6-8 years.” An extensive migration to Linux would prevent millions of tons of electronic waste, according to Ecogeek.
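The scale of the claim can be checked with back-of-the-envelope arithmetic. The fleet size comes from the Forrester figure cited earlier; the per-machine weight is an assumption for illustration; the refresh cycles are the midpoints of the report’s 3-4 and 6-8 year ranges:

```python
# Back-of-the-envelope estimate of how a longer hardware refresh period
# cuts e-waste. The per-PC weight is an assumption for illustration;
# the 3.5- and 7-year cycles are midpoints of the report's 3-4 year
# (Windows) and 6-8 year (Linux) figures.

fleet = 1_000_000_000       # ~1 billion PCs in use (Forrester, end of 2008)
kg_per_pc = 10              # assumed average machine weight in kg

def annual_ewaste_tonnes(refresh_years):
    machines_retired_per_year = fleet / refresh_years
    return machines_retired_per_year * kg_per_pc / 1000  # kg -> tonnes

windows_cycle = annual_ewaste_tonnes(3.5)  # ~2.86 million tonnes/year
linux_cycle = annual_ewaste_tonnes(7.0)    # ~1.43 million tonnes/year

saved = windows_cycle - linux_cycle
print(f"potential saving: {saved / 1e6:.2f} million tonnes/year")
```

Doubling the refresh period halves the machines retired each year, so even with these rough assumptions the saving lands in the "millions of tons per year" range the study describes.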

  • The majority of Linux versions can run without problems on computers with a Pentium I processor and 128 MB of memory.
  • The most popular versions, and the least complicated for inexperienced users, are free and can be downloaded online. Any user running Windows XP, for example, can visit the site of any Linux version, download it and install it. Besides reading the instructions, it’s advisable to keep another way of accessing the Internet at hand, to avoid installation problems or frustrations that might keep users from taking advantage of the free operating system they’ve downloaded.
  • Any computer can function with Linux. Even old computers that you no longer use, but have stored at home.
  • Computers with Linux are cheap and open to exploration. They promote the user’s technical autonomy and are a fundamental pillar in encouraging adolescents to love math and computing. The results are spectacular in educational communities worldwide that have opted, above all for cost reasons, to install Linux on their computers.
  • Linux machines are based not on the commercial concept of downloading applications, but on that of carrying out tasks and communicating in the optimal way. An example is the excellent work carried out by the OLPC organization with its XO laptop.
  • Linux machines consume less energy: graphics requirements are lower, and the hardware companies that have released Linux computers prefer machines that are simple to use and of limited power. For these companies the computer is an affordable, simple means of connecting to the Internet, not a flashy multimedia tool.
  • Linux is based on standards that we can all improve upon.
  • Several other factors make Linux a convenient alternative, among them the scarcity of virus attacks and the absence of frequent updates that cost the user money.

Linux (its distributions, or packaged releases of the software, are known as “distros”) has always played the underdog. Why is its presence so marginal? Until relatively recently, no version of Linux had gained mass popularity among users.

Linux is not easy to use, claim its critics. As a response to a blog entry about why this open source operating system is the authentic alternative, one reader wrote, “I’m still waiting for a Linux distro that runs games and other Windows apps and you don’t need a diploma in programming to use.”

This humorous comment, which touches on real issues, overlooks the fact that many users don’t play video games, and that distributions like Puppy Linux, Ubuntu, gOS (an adaptation of Ubuntu that aims to offer simple, elegant support for Google applications) and Xandros Linux (this last “distro” is commercial and costs 100 dollars) simplify the use of the system.

Both Ubuntu and the latest version of gOS, released at the CES computing conference in Las Vegas in January 2008, have captured extensive media attention and are the two systems most widely used among less specialized users.

gOS Rocket (version 2.0) incorporates Google Gears, so users can work offline on documents and e-mails, which are updated and sent when the user next has a chance to connect.

Popularity for the penguin

The hour of computers with simple and attractive versions of Linux has arrived, if we pay attention to some forecasts. What has changed in the past year?

Basically, there is now a market: inexperienced users willing to look beyond Macintosh for an alternative to a PC with Microsoft Windows. And that alternative is much closer, in philosophy and technical characteristics, to Web 2.0.

For the first time, Linux ranks overwhelmingly among the most popular technology products of 2007 in the Amazon store (Amazon Best of 2007).

Linux’s strong showing on the 2007 list, as Slashdot explains, is worth noting: when Amazon’s lists point to a trend, there is no need to interpret them conservatively. These are votes that come directly from users, and they carry a certain weight, given that the universe of participants (Amazon customers) is more heterogeneous than, say, TechCrunch’s or Slashdot’s.

The “computers” section of the list includes:

  • Nokia Internet Tablet PC (with Linux), as the best-selling product.
  • Asus Eee PC 4G-Galaxy, a 7-inch mobile Internet device, as the product appearing most frequently on user wish lists.
  • The other two popular products on the list (the best reviewed and the most often selected as a gift, both the Apple MacBook Pro notebook) run a Unix-based system (Mac OS X). No Windows products appear on the list.

Serdar Yegulalp writes in Information Week that Linux has moved beyond its role as a project by and for open source enthusiasts to become one of the most powerful and important forces in the world of software, thanks to its adoption as the underlying system for many applications and complex programs designed for the Internet.

Linux, writes Yegulalp, is “now shaping up to be an increasingly viable choice as a desktop operating system, thanks to the effort of both the volunteer community and the companies that are banking on Linux to move them forward.”

Nevertheless, this Linux specialist cautions that Linux is not for everyone. “I know both experts and regular users alike who have switched to it, as well as experts and regular users who have tried it and stayed with other things (whether Windows, Mac or another flavor of UNIX). But Linux is unquestionably drawing in more people than it did a decade ago, or even five years ago.”

Several businesses are taking advantage of users’ greater awareness of energy use, as well as increased attention to business ethics, to market their first computers running a Linux “distro” as the operating system.

Currently, the following low-consumption, inexpensive computers, among others, use Linux as their operating system:

  • OLPC XO-1 (laptop, 200 dollars, Fedora Linux with the Sugar graphical user interface), the laptop for children in poor countries.
  • Everex Green PC -gPC- (desktop, 200 dollars, Ubuntu with the gOS Rocket interface; designed to work with online applications like Google Apps, Google Docs or Zoho Office Suite, and relies on Google Gears for off-line work).
  • Everex CloudBook (laptop, 399 dollars, gOS Rocket with Google Gears).
  • Shuttle Linux PC (desktop, 199 dollars, Mandrake Linux). In the press.
  • Asus Eee PC (laptop, 299 dollars, Xandros Linux). In the press.
  • Linutop (desktop, 99 dollars, Linux 2.6 kernel). In the press.
  • Zonbu Zonbox Zero Emissions PC (desktop, 99 dollars plus 12 dollars per month for a service package that includes storage and remote management through Amazon’s S3 cloud computing service). Zonbu’s hardware production process complies with EU guidelines on plastics, based on the RoHS directive. It is billed as the first carbon-neutral computer in the world, thanks to the machine’s low consumption and to the emission rights the company purchases from the Climate Trust. In the press.
  • Mirus Linux PC (desktop, from 199 to 284.99 dollars, Freespire 2.0, an Ubuntu adaptation).
  • Aleutia E1 Desktop (desktop, 179 pounds, Puppy Linux). An optional accessory kit lets it run without mains electricity, using a small folding photovoltaic panel. In the press.

It seems that PCs are becoming lighter, or almost disappearing. After all, if we want to reduce our carbon footprint while continuing to use our everyday Internet applications efficiently, we don’t need a workstation that sounds like a jet engine.

The new market of low-consumption, inexpensive computers running some distribution of Linux is so promising that one of the main developers of the XO left the OLPC team to found a business, Pixel Qi, whose objective is to create a low-consumption laptop running Linux that would cost 75 dollars (coverage of the news in ZDNet, The New York Times and TechCrunch).

Objective: the Internet

In these devices, the objective is the Internet: mobility, permanent connection (Twitter, etc.), user-generated content, the Long Tail, Web 2.0.

It is perhaps the first example of what happens when applications live on the Internet, documents are kept online and nothing needs to be stored locally, as the quality of connections improves.

In this context, an increasingly large group of users will stop using Windows and Mac OS as their points of reference for leisure and daily work. Instead, according to this idea, they will seek a connected device with enough processing power, easy and comfortable to use, with a browser, a Wi-Fi connection and little more.

If this calculation plays out, the result is a laptop very similar to OLPC’s XO.