The Way of Google's Future
September 23, 2005
I came across an article on my favorite tech news site, ZDNet, pointing out that Microsoft predicted 10 years ago that the Internet would be the next platform. But Microsoft still spent bazillions of dollars building Windows XP and the new Windows Vista. Meanwhile, under Microsoft's radar, two Stanford students developed something in their dorm room, a search engine, and in 2005 they are big. HUGE. Google. With Google's way of innovating, their ideas, and the top minds in the field (except they don't have me yet :p ), they are developing a lot of things, and none of them are platform specific; they are for the Internet. Gmail, Maps, etc. They have more, and you'll see them by visiting their beta section. So now Microsoft is feeling the heat. Without being tied to a specific operating system, you can use any of Google's Internet products. Microsoft just wants to take over the world, so they will fight this and start building their own equivalents. Or maybe they simply don't want Google to get too big, because then Google can go stealing all of Microsoft's smart employees, paying them the big bucks, giving them the Presidential Suites, etc., and using them to develop products specifically targeting Microsoft's, instead of Microsoft doing that to them. The playing field has been leveled a bit.
It's an interesting concept, the Internet as a platform. The way I picture it, the possibilities are endless. Before I had that vision, though, and before I read that article, I had pictured something a little different, something like Google's latest platform, the desktop search. Plug components into a base platform, and the base provides much of the functionality those components need, making for quicker development. Think of the Mac's Dashboard widgets: there's a widget container that provides lots of functionality to the widgets, and then there are widgets that you can plug in. I always imagined something like an application container. I could have small apps that plug in, and you could open any of them from this container. I had thought of this before the Mac's widgets existed, but instead turned toward Internet applications. My main reason for this thought process was how Java works. I didn't know if you could make a Java program run automatically by double-clicking it; you always had to open it with another program. So, naturally, have everything run under one program. (I later found out about JNLP, the Java Network Launching Protocol, which launches JAR files containing a Java program.)
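For the curious, a JNLP file is just a little XML descriptor that tells Java Web Start what to download and run. Here's a minimal sketch; the URLs, jar name, and class name are all made up for illustration:

```xml
<?xml version="1.0" encoding="utf-8"?>
<jnlp spec="1.0+" codebase="http://www.example.com/apps" href="myapp.jnlp">
  <information>
    <title>My App</title>
    <vendor>Me</vendor>
  </information>
  <resources>
    <!-- Which Java runtime to use, and which jar holds the program. -->
    <j2se version="1.4+"/>
    <jar href="myapp.jar"/>
  </resources>
  <application-desc main-class="com.example.MyApp"/>
</jnlp>
```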
Sometimes a solution is so obvious for one problem, yet it isn't even considered for another.
What wasn't obvious to me is that this idea had already been done! In fact, everyone is doing it! When you visit a website, you are typically using an application written for the web. An application. Written for the web. My container application, the platform for running every program I write, is in fact your web browser. This seems like a great platform. Some obvious things you have to watch out for: backing up data, security, the limitations of certain web browsers, certain web browsers not following web standards, downtime, scalability, application flow, user experience, and users themselves. Some great benefits of web applications: easy deployment, updating everyone's version instantaneously, data stored in a central location, and, if you lock down the server and write the application carefully, it's very hard to hack. A client application obviously has its own benefits. You can access local resources (disk drives) and do things you can't do in a web application, like video games and accessing hardware, and things that would kill the resources on a web server if too many people did them at once... intense applications. Basically, it depends on the application whether you should make it a client application or a web application, and whether you even can make it a web application.
There aren't too many downsides to writing a web application, but the ones that exist are pretty big. Here's another: HTTP. HTTP is pretty primordial. It's the protocol through which web servers communicate with the world, and it consists of status codes, headers, and data separated by line breaks. It was developed before XML existed. XML has its own obvious downsides: it's heavy, lots of text. Depending on your data, XML can double its size. It's mainly meant for text, so you wouldn't normally go storing your images in there. I only bring this up because of client/server applications, or server-to-server communication, which still falls under client/server. This is why SOAP was invented. SOAP is an XML format designed so that any number of applications can send structured data over HTTP. A standardized format is a good start: HTTP can stay as it is, as long as everyone uses SOAP. This was the advent of web services: small applications that run on the server and communicate with the client, usually exposing just a function or two. There's a huge history there (search the Internet for RPC, or "Remote Procedure Call", and you'll see what I mean), and the idea was to create one standard way of calling functions over the Internet, rather than hundreds of developers fending for themselves, each inventing a different way.
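To make that concrete, here's roughly what a SOAP call looks like on the wire: a plain HTTP POST carrying an XML envelope. The endpoint, namespace, and GetPrice operation are all invented for illustration; only the envelope structure is standard SOAP 1.1:

```
POST /services/catalog HTTP/1.1
Host: www.example.com
Content-Type: text/xml; charset=utf-8
SOAPAction: "http://www.example.com/GetPrice"

<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetPrice xmlns="http://www.example.com/catalog">
      <ItemName>Guitar</ItemName>
    </GetPrice>
  </soap:Body>
</soap:Envelope>
```

The server replies with another envelope containing the result, so any client on any platform that can speak HTTP and parse XML can call the function.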
One of the important downsides I mentioned about writing web applications is user experience. This isn't about making users laugh or showing help or clever messages. It's about the "perceived speed" of an application. Who wants to watch a progress bar at the bottom of the screen? Or watch the whole website go white and take a few seconds for something to pop up? In client-side programming, you typically write a multithreaded application to improve user experience: things appear to happen simultaneously. But a web application runs on a web server, and the only protocol the browser has for speaking to the server is HTTP, which makes requests only at the user's request (hence the name) and returns responses. So how in the world do you make an HTML web page seem "multithreaded"?!? AJAX. You may have heard of it. The acronym stands for Asynchronous JavaScript And XML, and that's pretty much what it is. With it, I can have JavaScript make requests back to the server without the user's interaction, typically on a schedule (every 5 seconds, every minute, etc.), and get that ever-so-desired perception of multithreading in a web application, significantly improving the user's experience.
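A minimal sketch of the idea, polling every 5 seconds. The /inbox/unread URL and the "inbox" element are invented for illustration, and note that today's IE needs the ActiveX flavor of the request object:

```javascript
// Create the request object; IE 6 needs the ActiveX flavor.
function createRequest() {
  if (window.XMLHttpRequest) {
    return new XMLHttpRequest();
  }
  return new ActiveXObject("Microsoft.XMLHTTP");
}

// Ask the server for updates, with no user interaction needed.
function checkForMail() {
  var req = createRequest();
  req.onreadystatechange = function () {
    if (req.readyState == 4 && req.status == 200) {
      // Swap in just the new content: no full page reload, no white flash.
      document.getElementById("inbox").innerHTML = req.responseText;
    }
  };
  req.open("GET", "/inbox/unread", true); // true = asynchronous
  req.send(null);
}

// Poll on a schedule; the page keeps responding while requests are in flight.
setInterval(checkForMail, 5000);
```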
Google has realized this. Maps and Gmail use AJAX extensively. It is the way of the future, and it is important enough that soon every browser will support it. But this isn't just about writing one web application that appears friendly to the user. It's about writing many applications that are all friendly with each other, and that together appear friendly to the user.
Imagine an Internet portal, the first page you visit on the web, that has everything: news, stocks, your email, messages sent to your IM client while you were away, emails from your other accounts, voice mails from work and from your cell phone, reminders about events in your calendar, and anything else you can think of. This is Google's vision... probably. Imagine all this personal data on one website, collected from many different web applications that use SOAP to communicate with each other, sent as XML to the user's browser on each AJAX request, and read on the fly to determine which advertisements to show that user. Advertising is still Google's main source of income, after all, stock sales aside.
"But Google's also buying up loads and loads of dark fiber and buying wireless internet technologies and WAPs" you say... Yes, they have invested in a company that can triangulate exactly where you are when you connect to a wireless network. So you can search for the closest guitar shop to the exact point on which you are standing. This on a portal full of all of that other information I mentioned would just be showing off.
This is where I think Google is heading. As with its search technology, I think the Internet can do better. I must emphasize this. I've mentioned this before, here. I think all of Google's web applications will supply their data this way. I quote myself:
"Imagine, if Google, instead of just reading all of the HTML through a website url, can just ask a website "Yo, what's your deal?!" and the website can respond back "Dude, I am a guitar shop, here are my wares.""
RDF, the format behind RSS news feeds, is exactly this for news. Somehow Google is already able to extract prices of goods from websites as well, and build a shopping cart around them. But instead of Google merely searching those results for items you may be looking for, what if there were no website that actually sold the stuff? Google would just read data from a server, through another protocol, and do everything: shopping cart, credit card processing, etc. Google would be the only online shop. Or, what if someone else did this. Like me! No, there's an "end of the world" scenario in there somewhere: no more online shops, just Google, fewer jobs, less money, and more Google. It could be bad. Let's hope they're only building the portal mentioned above :)
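For a taste of what a machine-readable storefront might look like, here's a hypothetical RSS 1.0 (RDF) item; the shop URL and product are invented, and a real feed would also need a channel element:

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns="http://purl.org/rss/1.0/">
  <!-- A real RSS 1.0 feed also requires a <channel>; trimmed for brevity. -->
  <item rdf:about="http://guitarshop.example.com/strat">
    <title>Fender Stratocaster - $599</title>
    <link>http://guitarshop.example.com/strat</link>
    <description>Dude, I am a guitar shop; here are my wares.</description>
  </item>
</rdf:RDF>
```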