SQL Server, Analytics, .Net, Machine Learning, R, Python
Mitch Wheat has been working as a professional programmer since 1984, graduating with an honours degree in Mathematics from Warwick University, UK in 1986. He moved to Perth in 1995, having worked in software houses in London and Rotterdam. He has worked in the areas of mining, electronics, research, defence, finance, GIS, telecommunications, engineering, and information management. Mitch has worked mainly with Microsoft technologies (since Windows version 3.0) but has also used UNIX. He holds the following Microsoft certifications: MCPD (Web and Windows) using C#, and SQL Server MCITP (Admin and Developer). His preferred development environment is C#, the .Net Framework, and SQL Server. Mitch has worked as an independent consultant for the last 10 years, and is currently involved with helping teams improve their Software Development Life Cycle. His areas of special interest lie in performance tuning.
Sunday, July 30, 2006
More comments on "Why Vista?"
Several MVPs have been talking about why Vista will or won’t achieve early widespread adoption and the underlying reasons: Why Vista?, Vista and Cars, Ford Falcon or Plymouth Fury - Is Vista good enough to sell?
I like playing Devil's Advocate! As I mentioned in my reply to Darren's post, I see Vista as overwhelmingly more important for developers than end users in the short term. I agree that applications maketh the OS (isn't that always true?). End users want new features when they make their lives easier (after factoring in the pain of any re-learning process).
Being able to blog from within Office 2007 is great for people who have a blog, but does the average user care? No, of course they don't. Where are the adaptive applications that learn a user's habits? That adapt to specific behaviour (not quite the same as annoyingly hiding infrequently used menu items!). If you give users software that feels like it cares about them, you get 'buy-in' and a higher tolerance for change.
Here's an idea: build in an 'Adaptive Decisions Widget' that keeps track of all (revertable) decisions that the OS and applications make on my behalf. For instance, if I consistently reply the same way X times to the same question, ask me if I'd rather not be asked, and store the decision if one is made. It's a bit like Alan Cooper's design ethic: don't ask me to confirm a delete action, just perform the action but make it undoable. Give me a warm fuzzy glow, instead of a resigned sigh!
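To make the idea concrete, here is a minimal sketch of how such a widget might track answers. Everything here is hypothetical (the class name, the `threshold` parameter, the prompt identifiers); it's just one way to implement "if I reply the same way X times, offer to stop asking", with every stored decision kept revertable:

```python
from collections import defaultdict

class AdaptiveDecisionTracker:
    """Hypothetical 'Adaptive Decisions Widget': remembers answers to
    repeated prompts and, once the user has answered the same way
    `threshold` times in a row, offers to stop asking."""

    def __init__(self, threshold=3):
        self.threshold = threshold        # identical answers needed before offering to remember
        self.history = defaultdict(list)  # prompt id -> list of past answers
        self.remembered = {}              # prompt id -> stored (revertable) decision

    def answer_for(self, prompt_id):
        """Return the stored decision, or None if the user must still be asked."""
        return self.remembered.get(prompt_id)

    def record(self, prompt_id, answer):
        """Record an answer; return True when it's time to ask 'Don't ask again?'."""
        self.history[prompt_id].append(answer)
        recent = self.history[prompt_id][-self.threshold:]
        return len(recent) == self.threshold and len(set(recent)) == 1

    def remember(self, prompt_id, answer):
        """Store a decision so the prompt is skipped from now on."""
        self.remembered[prompt_id] = answer

    def revert(self, prompt_id):
        """Undo a stored decision; the prompt will be shown again."""
        self.remembered.pop(prompt_id, None)

# Usage: the user answers "No" to the same confirmation dialog three times.
tracker = AdaptiveDecisionTracker(threshold=3)
for _ in range(3):
    offer = tracker.record("confirm-delete", "No")
if offer:                                  # time to offer "Don't ask again?"
    tracker.remember("confirm-delete", "No")
assert tracker.answer_for("confirm-delete") == "No"
tracker.revert("confirm-delete")           # and it's always revertable
assert tracker.answer_for("confirm-delete") is None
```

The key design point is the last method: nothing is ever locked in, which is exactly what gives users that warm fuzzy glow rather than a resigned sigh.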
Mitch Denny wrote:
>"What operating system upgrades do is set a baseline which can shorthand discussions, for example, because I know that Vista is going to ship with WPF, WCF and WF I can simply say 'you must be running Windows Vista to run this software'."
I'm not even sure that will be true: what about new versions of the framework? (We've had two versions in this XP iteration.) We have even seen people on the ausdot stanski list saying that they had to stick with version 1.1 of the .Net Framework because of the inability to get 2.0 installed (due to its size). Even if it is true, I still think it directly benefits developers of new software more than end users. If you write a new application and want to maximise your target market and profits, you do not force users onto the latest OS and nothing else!
Inside many organisations, the main (even sole) driving force for developing browser-based applications is the zero install footprint. Having gone down that path, there seems little incentive for such organisations to switch to Vista just so they can rewrite all their web apps as Windows Forms applications taking advantage of WPF, WCF, and WF.
There are a heck of a lot of large organisations still running Windows 2000!
I lay down a challenge: name a reason why an average office worker will be better off with Vista! (I'm not saying there aren't any, just curious, that's all… I wonder what the official marketing line is?)
MSN, Email: mitch døt wheat at gmail.com