Wednesday, December 13, 2006

The Secret of MegaTasking - Revealed!

Just had a chance to talk to Henry from the A company, to understand the funky term - MegaTasking ...

Q: What is multitasking?
A: The running of two or more programs in one computer at the same time.

Q: Then what is the difference between MegaTasking and multitasking?
A: MegaTasking is similar to multitasking in some sense, but it is more than just computing.

Q: Can you elaborate further?
A: Sure! MegaTasking is about convergence. While our competitor is/was talking about computing, networking and communication convergence, we aim much higher. It converges almost everything in your daily life: what you do in the living room, the kitchen, the laundry, the neighbor, society, the LAN party, (342 words omitted). With our sophisticated design, advanced process, brilliant individuals, smart executives, enormous fanboi base, (1178 words omitted) ...

Q: So ...? What does it have to do with the living room? Are you talking about your 'Live' stuff?
A: Nope, more than that.

Q: Is that so?
A: Yup, the moment you turn on MegaTasking in the living room, you get a PC and a heater.

Q: What if it is summer?
A: Err ... then you just turn your living room into a sauna spa.

Q: ... then the kitchen?
A: With a proper casing, you've just got yourself an oven. You can read the recipe on the screen and bake the cake at the same time!

Q: ... ... then the laundry?
A: Just put your wet clothes close to the fan and they will dry instantly. Better than most commercially available clothes dryers!

Q: ... ... ... then the neighbor?
A: What's more fun than directing the noise at your stupid neighbor who uses our competitor's product? I'm sure our fanboi base would love this.

Q: ... ... ... ... and anything else?
A: What's even more fun than 'legally' disturbing your opponents with noise and heat at a LAN party game competition?

Q: ... ... ... ... ... and some more?
A: MegaTasking is an innovation for the innovative. Think about it for a few minutes; I'm sure you can list more uses than what I have said.

Q: ... ok, anything else to say?
A: Yup, our MegaTasking is fast.

Q: How fast?
A: It can consume 1 megawatt-hour in just 41 days of full-day usage - that works out to roughly a kilowatt of continuous draw. Our competitor is not even close to that.

Q: What about the computing speed?
A: Sorry, I gotta take a whiz ... bye.

Friday, December 08, 2006

Another Joke of the Day

Quote from http://www.newsfactor.com/story.xhtml?story_id=013001BYD5Z4
In making the announcement, AMD executives said that even at 90 nanometers and 90 watts, its chips, on average, consume half the power of an Intel Core 2 Duo. But Athlon's power consumption will drop even further, AMD said, with the 65-nm chips that will run at an average 65 watts.

Wow, is the AMD executive hiring criterion the ability to lie without blinking an eye? :) (For reference, the Core 2 Duo carries a 65 W TDP; half of that is nowhere near 90 W.)

Friday, November 10, 2006

Funny Q&A

I just came across this funny Q&A that I really couldn't resist putting on my blog, so that maybe someone can decode what the answer 'really' means :)

An excerpt from the INQ interview with the AMD guys:

INQ "That brings the question of drivers. AMD has been a staunch supporter of Linux, while many users of ATI had a hay-ride with drivers for Linux operating system. Nvidia, as the prime competitor has support of Linux community, while every once a while we hear news about petitions to ATI, drivers not working as intended."

Phil "AMD is driving the industry to an open world, and we focus our strengths and with combined approach, achieve what's best for development of the industry around us. All ISVs are important for us."

Btw, I just had a conversation with my friend, asking him how he feels about the current national politics, and he said that his son is almost 3 years old and asked how my son is doing. I answered that the table is made out of solid wood.

Found another joke from dailytech:

When asked if AMD has any concerns that its users may choose Intel processors if supplies of AMD chips run dry, DiFranco responded, "We don't expect our users to jump brand. Their loyalty comes from many years of dedication, and they're a sophisticated group. We think they will stay loyal over the long term; they're better served by sticking with AMD technology."

What a marketing guy! Anyway, a side thought here: if most AMD executives think the same, then AMD will soon be in trouble, as one commenter at that site said, reading my mind:
and isnt that the same mentality that hurt intel? I love AMD but intel is back for now so why would i stick to amd in the coming time.

Wednesday, November 08, 2006

Web 2.0's Impact on the Computer Industry

First and foremost, I hate the term "Web 2.0", simply because it is trademarked. That is just stupid; Tim should have disallowed it. Anyway, let's put this aside and talk about what I think the impact of Web 2.0 on the computer industry will be.

People could easily argue that with the advancement of Web 2.0, the thick client is no longer needed, and that thin clients plus powerful servers are the future. Well, to some extent, this is true. It would translate into increased server demand and better broadband connectivity.

However, the above deduction is simply too simplistic. Let's use the definition from http://en.wikipedia.org/wiki/Web_2.0 and some real-life examples to illustrate this.

The Web 2.0 characteristics as listed on the Wiki are as follows:
1) "Network as platform" — delivering (and allowing users to use) applications entirely through a web-browser.
2) Users owning the data on the site and exercising control over that data.
3) An architecture of participation and democracy that encourages users to add value to the application as they use it.
4) A rich, interactive, user-friendly interface based on Ajax.
5) Some social-networking aspects.

The deduction made in the second paragraph would be true for point #1. Google or even Microsoft will (eventually) enable some office applications through the web, be it over the Internet or as a more powerful version through a company's intranet. Most day-to-day jobs, be it business or engineering work, can be done through the network (for the engineering case, I mean a remote session to a server here).

Points #2 and #5, however (take YouTube as an example), still leave room for the thick client. A powerful client allows users to encode their video, and to some extent add in some funky stuff beyond pure video encoding, at a more comfortable speed. Besides that, wireless broadband will be a hit as users are able to upload their content anywhere, anytime. Mobility is also key here, and hence mobile devices will get a boost from Web 2.0. In this sense, the computer industry has to fight with the phone industry. The computer industry has to make use of its much higher processing ability to create content that is not really possible on a phone within the same period of time. Complex but easy-to-use video editing tools will be a selling point here.

To some extent, I believe online gaming provides some Web 2.0 characteristics. It provides an interactive meeting place and room for creative expression, and some games even allow users to affect the game scene; I'm sure newer games will have more features than what I have listed here. The 'virtual world' will definitely need a powerful client for a better 'virtual' experience. Unless broadband bandwidth increases dramatically, those virtual scenes will still need to be handled by the client.

Web 2.0 will not spell doom for the thick client. Instead, it can be another inflection point for every segment of the computer industry, a boom for both the thin and the thick client, as well as the server.

Wednesday, August 23, 2006

The UNTOLD Reason why AMD will have to go with Native Quadcore

While Intel is releasing its quad-core soon (expected in Q4 2006), AMD will release its native version of the quad-core 2 or 3 quarters later than Intel. AMD (and its fans) have also tried to play down Intel's non-native version, claiming the native approach is better.

What people fail to realize is that if AMD had used Intel's approach before coming out with the native version, the end result would be an internally NUMA quad-core, which is bad for mobile, desktop, or even as a NUMA node for a server MP. So, it is really not a matter of native being better than non-native for AMD; it is just that the non-native approach is NOT good for AMD.

Why bad? As of now (and for the foreseeable future), there are not many (if any) apps written with NUMA optimization. And most desktop/laptop apps don't require that level of memory bandwidth (NUMA has better aggregate bandwidth, but with a catch: it needs software optimization, and that doesn't pay off in all workloads). NUMA is a sensible thing in a server MP, not in a desktop or laptop. Having 2 memory links also raises the system cost, making it unsuitable for cost-sensitive markets.
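
To give a feel for what 'NUMA optimization' actually asks of software, here is a minimal sketch, assuming a Linux box with libnuma installed (compile with something like gcc -std=gnu99 -O2 numa_sketch.c -lnuma -lrt); the node ids, buffer size and timing loop are illustrative assumptions on my part, not a benchmark. It pins the process to one node and compares sweeping memory allocated on the local node against memory allocated on the farthest node; the point is that the software has to place its data deliberately to see any benefit.

/* Illustrative sketch only: NUMA-aware allocation with libnuma on Linux. */
#include <numa.h>
#include <stdio.h>
#include <string.h>
#include <time.h>

#define BUF_SIZE (64UL * 1024 * 1024)   /* 64 MB test buffer (arbitrary) */

/* Touch one byte per cache line and return the elapsed time in ms. */
static double sweep_ms(volatile char *buf, size_t len)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < len; i += 64)
        buf[i]++;
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) * 1e3 + (t1.tv_nsec - t0.tv_nsec) / 1e6;
}

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "No NUMA support on this system\n");
        return 1;
    }

    int last = numa_max_node();         /* highest node id, e.g. 1 on a 2-node box */
    numa_run_on_node(0);                /* keep this process on node 0 */

    char *local  = numa_alloc_onnode(BUF_SIZE, 0);     /* memory next to us        */
    char *remote = numa_alloc_onnode(BUF_SIZE, last);  /* memory on the other node */
    if (!local || !remote)
        return 1;
    memset(local, 0, BUF_SIZE);         /* fault the pages in on their nodes */
    memset(remote, 0, BUF_SIZE);

    printf("local  sweep: %.1f ms\n", sweep_ms(local, BUF_SIZE));
    printf("remote sweep: %.1f ms\n", sweep_ms(remote, BUF_SIZE));

    numa_free(local, BUF_SIZE);
    numa_free(remote, BUF_SIZE);
    return 0;
}

On a machine with only one memory node the two numbers come out the same (or the tool reports no NUMA support at all), because there is nothing to tune; which is exactly why an internally NUMA quad-core buys the typical desktop or laptop user very little.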

For mobile specifically, the design is mainly driven by form factor, power and wireless. 1P is definitely the solution for it. Having NUMA within the 1P just means it needs a minimum of 2 DIMMs and so might not be a good candidate for certain very-small-form-factor mobile devices. The unnecessary memory bandwidth in most mobile applications also drives the power up, while not guaranteeing significant improvement. (I'm not even sure whether certain apps would show a negative improvement.)

For the server MP particularly, if this internally NUMA chip is used as a NODE, there will be multiple node distances in the whole MP design, which again makes the software optimization harder.

Wednesday, July 26, 2006

Da Vinci Hinted at the AMD-ATI Merger

An internet site reported that an unseen Da Vinci manuscript was found yesterday and, to the surprise of the researchers, the AMD+ATI merger was predicted by Da Vinci hundreds of years ago. The words below appear on numerous occasions within that manuscript:

DAAMIT
I.AM.TAD
I.AT.MAD
AIM.TAD
MAD.AT.I

:)

Saturday, July 22, 2006

IMC Myth

There seems to be overhype around the IMC (integrated memory controller) within the x86 CPU. I am not bashing the goodness of having an IMC and thus lowering the memory latency; my point is that it is simply overhyped. Every single feature within a CPU is an engineering decision in one way or another. The main focuses are the overall system performance and the target platform usage models.
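
To make 'memory latency' concrete, here is a rough pointer-chasing sketch of the kind of microbenchmark people use to estimate it (buffer size, shuffle and step count are arbitrary assumptions on my part; real tools such as lmbench's lat_mem_rd are far more careful). Each load depends on the previous one, so the average time per step approximates a round trip to memory, which is the number an IMC improves.

/* Illustrative only: crude pointer-chasing estimate of memory latency.
 * Compile with something like: gcc -std=gnu99 -O2 chase.c -lrt          */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define ENTRIES (16UL * 1024 * 1024)    /* 16M pointers = 128 MB, well past any cache */

int main(void)
{
    size_t *next  = malloc(ENTRIES * sizeof *next);
    size_t *order = malloc(ENTRIES * sizeof *order);
    if (!next || !order)
        return 1;

    /* Build a random cycle through all entries so the prefetcher can't guess the walk. */
    for (size_t i = 0; i < ENTRIES; i++)
        order[i] = i;
    srand(12345);
    for (size_t i = ENTRIES - 1; i > 0; i--) {          /* Fisher-Yates shuffle */
        size_t j = (size_t)rand() % (i + 1);
        size_t tmp = order[i]; order[i] = order[j]; order[j] = tmp;
    }
    for (size_t i = 0; i < ENTRIES; i++)
        next[order[i]] = order[(i + 1) % ENTRIES];
    free(order);

    /* Chase the chain: each load depends on the previous one. */
    const size_t steps = 50UL * 1000 * 1000;
    size_t p = 0;
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < steps; i++)
        p = next[p];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("~%.1f ns per dependent load (checksum %zu)\n", ns / steps, p);

    free(next);
    return 0;
}

The IMC shaves a chunk off that round trip; the open question is how much that single number matters once a large cache and a smart prefetcher sit in front of it.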

Intel's new Core 2 Duo is without an IMC, and yet it still outperforms AMD's K8 with its IMC; that speaks for itself. It is not that Intel will never use an IMC, it is just that it does not need one yet.

Then there are people who argue about scalability (NUMA vs UMA), saying that the C2D would not scale as well as AMD's, and that in 2 years' time AMD's CPU might take the lead again because of this. Well, I wouldn't disagree on the scalability issue and the future possibility of AMD taking the lead again. But who cares? As a desktop and laptop user, if I were to buy a decent system nowadays, I'd definitely go for Intel's C2D, at least for now. The scalability issue is not my issue; it is Intel's architects' and design engineers' issue. Some might want to question this further: "Yes, Intel can raise the FSB frequency and enhance the cache design for 2 cores or possibly 4 cores, but it will surely hit a bottleneck when it designs 8 cores and above." Well, that is not a user's concern. It is again their design team's concern to overcome this, be it using an IMC or some other method.

Wait a minute, what about MP? As far as my desktop and laptop are concerned, I will not be using one, again, at least for the next few years. Why should I incur such ridiculous hardware costs, and possibly software costs, when a decent single multicore processor can do the job?

Having said all that, AMD's IMC and ccHT (hence NUMA) do give it an advantage at the 4P-and-above server end. The IMC is definitely not a deciding factor at the desktop and laptop end as of now, and not in the 2P server either, because dual-FSB chipsets are available.

Friday, July 21, 2006

From Fanboism to Extremism

Intel just started its marketing campaign around the Core 2 Duo a few months back, and that's exactly when I started participating in blog commenting. There were occasionally funny comments appearing in technology news feedback, portraying fanboism if not extremism :)

After sending my comment to sharikou.blogspot.com today, I had a thought: why not start my own blog and express my view on those fanboism comments? Of course, I might write my thoughts on technology as well, especially regarding the computer industry.

AMD fanboys, and at the extreme end, Sharikou, like to disagree with whatever Intel does, discredit its ability, bad-mouth it, and most of the time, make funny judgements. While I will not comment on his personal views, I will try to prove some facts here using logic, in a funny way :)

Sharikou thinks that AMD's manufacturing capability is far superior to Intel's. He would claim Intel has bad yields compared to AMD, that its 65nm process has immature yields, etc. The list is quite long; if you are really interested, you can visit his blog.

Below is the logic to prove him wrong:

quote from informationweek
In some cases, executives said, AMD walked away from business when price points became so low the deal was deemed unprofitable.
Henri Richard, AMD's executive vice president of worldwide sales and marketing, said AMD would only take business that makes sense for the company. "We are not going to chase what I call lighting a cigarette in front of a gas leak," he said.

CPU prices are from the links below:
http://www.hkepc.com/bbs/itnews.php?tid=633569
http://www.hkepc.com/bbs/itnews.php?tid=632181&starttime=0&endtime=0

Minimum AMD CPU price is USD51
Minimum Intel CPU price is USD39

Please allow me to use my limited logic analysis here:

1) AMD is a GOOD company, and will do good things for humanity
2) Selling CPUs is about making money, no matter how little; this is true for both Intel and AMD
3) Hector is a good man and won't lie

Assume everyone wants to make at least a 10% profit, but AMD can't push its price below USD 51. So, I'll assume its low-end CPU cost is about USD 46 (51 / 1.10), and Intel's low-end CPU cost is about USD 35 (39 / 1.10).

Could AMD be making more than 10% on it, which would prove Sharikou's point that AMD's APM is far superior to Intel's Copy Exactly? It can't, without violating points 1 and 3. Since AMD is so good and supportive of humanity, of course AMD would support the USD 100 PC initiative. The key there is low CPU cost. A GOOD AMD would definitely sell a cheaper CPU when it can, and a good Hector would not lie.

So, could Intel actually be selling at a loss? It can't either, since Sharikou thinks Intel is so evil that it would simply not make its CPUs to support the USD 100 PC initiative, nor sell them at a loss.

So, the conclusion is what the industry has already recognised (except for Sharikou, his friend mike and the AMD marketing VP): Intel has far superior manufacturing :)

Btw, his posts contain endless jokes, from the Dell laptop explosion being caused by the Intel CPU (he managed to relate the 2 explosion sounds to dual core ...) to Intel going bankrupt in 7 quarters.

Anyway, I'm not predicting that AMD will go bankrupt here, and I believe AMD will continue to be a strong competitor to Intel, despite the fact that Intel has the lead currently.