Thursday, April 18, 2013

The Synthesis of Thin & Fat Clients

My buddy Vic once used the phrase 'mega trends'. I like that phrase. He used it while giving me his take on CORS and how it will shift things back to the browser-based fat client model. Firefox OS and ChromeOS are blurring the lines between thin and fat clients; yet something feels missing in this war of thin vs. fat clients. What's the missing link before the next cycle happens? What will it look like? When it does happen, how would I think about Product differently?
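To pin down what CORS actually changes: a page served from one origin can now call APIs and pull content from any other origin that opts in, which is what makes a serious browser-based fat client practical. A minimal sketch (the endpoint is made up; the real requirement is that the remote server answers with an Access-Control-Allow-Origin header that permits the page's origin):

```typescript
// Hypothetical example: a page served from one origin calls an API on a
// different origin. The browser allows it only because the remote server
// replies with an Access-Control-Allow-Origin header covering this page.
function loadRemotePhotos(onLoaded: (photos: unknown) => void): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "https://api.example.com/v1/photos"); // cross-origin request
  xhr.onload = () => {
    if (xhr.status === 200) {
      // All the business logic runs here, in the browser.
      onLoaded(JSON.parse(xhr.responseText));
    }
  };
  xhr.send();
}
```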

The missing link is that we're going to have an evolution, a synthesis, of thin and fat clients.


Just Another Cycle of Thin & Fat Clients?


In the 60s, 70s, 80s, 90s, and 2000s there were cycles of thin and fat clients. In the beginning you used Unix and a shell to access another computer. This was a thin client, and it stayed that way because any meaningful computation had to happen on a really big machine housed in another room. Then computers got small and powerful (PCs), so you had fat clients. Ethernet came along, and thin clients were possible again. When the internet arrived, data transmission was expensive, so the fat client came back, and then browsers started having their own thin and fat client cycles.

All these cycles were based on two things: the computational power of devices and the speed at which information could be transferred. When one of those two became cheap, we could decide where to house content and business logic, ping-ponging between local and remote.

The important point is that it was always a breakthrough in just one of those technologies that caused a pivot; the two moved asynchronously. Now it's different, because access to both power and content is cheap.


More & Faster!


Google Fiber makes up part of the missing link. Until technology like this becomes ubiquitous, products will struggle with the thin / fat client model. The key lies in what Hunter Walk says:

"...the gap between you and Internet totally disappears. The computer is responsive in a manner that I've never experienced before."

Take note of what he says. He said the 'computer' was responsive, not the browser or the internet. The internet totally melted away for him. When you can download a 1 GB file in seconds, that changes things, big time.

The next missing link is distributed computing, which everyone should be familiar with by now, made accessible through open-source implementations such as Hadoop. Crushing terabytes of data has never been easier.
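Hadoop's programming model is simple on its own; what Hadoop adds is running that model across a cluster and terabytes of input. As a toy, single-process illustration (this is not Hadoop itself, just the map/reduce shape it distributes), here is the classic word count:

```typescript
// Toy word count in the map/reduce style. Hadoop's contribution is running
// exactly this shape of computation across thousands of machines; this
// single-process sketch only shows the shape.
function mapWords(line: string): Array<[string, number]> {
  return line
    .toLowerCase()
    .split(/\W+/)
    .filter((w) => w.length > 0)
    .map((w): [string, number] => [w, 1]);
}

function reduceCounts(pairs: Array<[string, number]>): Map<string, number> {
  const counts = new Map<string, number>();
  for (const [word, n] of pairs) {
    counts.set(word, (counts.get(word) || 0) + n);
  }
  return counts;
}

const input = ["the quick brown fox", "jumps over the lazy dog"];
const pairs: Array<[string, number]> = [];
for (const line of input) {
  pairs.push(...mapWords(line));
}
console.log(reduceCounts(pairs)); // Map { 'the' => 2, 'quick' => 1, ... }
```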

Now, when you combine the ability to transmit gigabytes of data really fast with the ability to crush even more data even faster... what happens?


The Synthesis of Thin & Fat Clients


To illustrate what happens when both computational power and data transfer evolve in sync, we'll imagine what using Photoshop would be like.

Photographers use Photoshop, Bridge, and Camera Raw to manage large groups of photos and to edit individual ones. They might make a specific edit to one photo, or apply an edit to a whole batch. Depending on what's being done and the speed of the computer, the computation can be slow or fast. Now imagine if all the edits were fast... not because the computer was fast, but because Photoshop detected that a data-intensive operation was about to happen and sent it to a distributed network of machines. That network computes the result instantly and sends it back to Photoshop. The user wouldn't even notice.
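Here's a rough sketch of what that hand-off could look like from the application's side. Every name and number in it is invented; the point is that the local-vs-remote decision gets made per operation, behind the user's back:

```typescript
// Hypothetical sketch: the application estimates the cost of each edit and
// decides, per operation, whether to run it locally or ship it to a
// distributed cluster. Nothing here is a real Photoshop API; the names and
// the threshold are invented for illustration.
interface ComputeBackend {
  run(operation: string, pixels: Uint8Array): Promise<Uint8Array>;
}

const localMachine: ComputeBackend = {
  run: async (operation, pixels) => {
    // Placeholder: the real edit would run here, on the CPU/GPU under the desk.
    return pixels;
  },
};

const renderCluster: ComputeBackend = {
  run: async (operation, pixels) => {
    // Placeholder: the real implementation would stream the raw data to a
    // remote cluster over a very fast pipe and await the computed result.
    return pixels;
  },
};

// Arbitrary cut-off: below this size the local machine wins; above it, the
// fast network plus the cluster wins.
const OFFLOAD_THRESHOLD_BYTES = 256 * 1024 * 1024;

async function applyEdit(operation: string, photo: Uint8Array): Promise<Uint8Array> {
  const backend =
    photo.byteLength < OFFLOAD_THRESHOLD_BYTES ? localMachine : renderCluster;
  // The caller, and the user, never find out which backend did the work.
  return backend.run(operation, photo);
}
```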

Get it? There is no more 'thin' or 'fat' client. There is no asking:

 'where do we house all the business logic and content?'

Instead, we'll be asking:

'which parts of the business logic and content will be local; which parts will be remote?'

The user won't even know. Everything will just be fast. If you're working on a Product, you better be planning for this - now.