Not heaps. $2000? Is that the monitor in AUD, or a whole iMac? I don't know what all the iMacs have used for various displays. With my father being an admin, I've had the luxury of growing up on good leftovers, and got to use enough different early LCDs to begin learning what made up the very obvious differences. There's nothing wrong with monitors that aren't expensive--I'd get an HP w2338h, if I needed a new one right now (230 USD)--but the pricier ones aren't more expensive for no reason, either.
Apple charge $1600 for the monitors, I got mine for $190 each (160 USD). I probably rounded up rather too far there.
I've yet to get any work done on any *buntu for any length of time. ALSA or Pulse breaks, or X never works right, or I can't compile something, etc. I try every version, and I've yet to have as smooth an experience as with other distros (distros that exhibit no problems tend to have older software than I want, such as SimplyMEPIS). Ubuntu consistently seems too delicate.
I'm the opposite.
I'd been using Debian, Slackware, Arch, SUSE, MEPIS, Mint, PCLinuxOS, etc., and refusing to use Kubuntu because of all the naysayers.
I used to have heaps of problems, but kubuntu has run perfectly every time.
Sometimes ATi's driver has a fit, but that's the only problem, and it actually has less to do with the OS than with the hardware manufacturer.
No idea, then. Though, I'd have to go cheap and get a Phenom.
I was originally going to get a Phenom II, but the price difference between the two was so minimal (1300 vs 1600) that I figured waiting longer and getting an i7 was better. My problem is that there's always another step up that isn't too much more, so I keep saving... and then there's always new hardware around the corner, so I wait for that too... I end up putting off getting a new PC for ages.
Endless Mike wrote:
Yes, a full Unix and POSIX-compliant system is dumbed down.
Get over yourself.
I know it's fully POSIX-compliant, and is unix, but it IS dumbed down.
There's almost no room for customizability; it's crippled. Every Mac screenshot looks identical.
With Windows and Linux, one actually uses the OS, tweaks the shell or WM, etc. I edit the registry and the config files to make it act precisely how I want.
The Mac appears to have people not use the OS, but rather just the applications.
I'll say no but only because there has to be someone out there with a Mac Pro he bought for gaming, or someone with a Hackintosh he uses OS X on for everything but gaming (which barely even counts). It's very atypical of Mac users, however.
So... would you suggest that the number is smaller than even that of linux gamers?
That's so odd.
Here, it seems like nobody uses the damn things, excepting uni students with MacBooks (many of whom put Linux on them).
Stores don't really sell Apple computers, but Windows is everywhere.
I don't think it's the easiest to use by a long shot unless you're assuming a user who has never touched a computer in their lifetime. Shit, wouldn't something you call "dumbed down" be the easiest by definition?
I am assuming someone who has not learned how to use any OS.
With Linux, you chuck the disc in, click next, next, next, and it's installed.
Installing apps is as simple as searching for what you want in the add/remove software app.
It comes with office, picture viewers, the best web browser, torrent clients, a fully featured burning suite, and so much more.
Honestly, Linux with KDE is the simplest and easiest to use, the best looking, the most advanced, and the most customizable.
They do cater to those that care about value, just not the value you care about, or I care about. Particularly those that find the cheaper units from other vendors, with Windows as their main OS, to be poor values. Value is about what you get for what you spend, not absolute price. Something cheap is not always something of value. Something expensive is not always a good value. They don't need to get a large market share. They need those with a large market share to not bother to cater to the minority that may find Apple products to be superior.
The only ones targeting Apple are Canonical, from what I can gather.
Microsoft stated not so long ago that their #1 enemy was piracy, and #2 was Linux.
Mark Shuttleworth said that he wanted Ubuntu to be as polished and unified as OS X.
All that functionality is still somewhat limited, and is not terribly mature. AMD and Intel are, IMO, both going in very good directions for long-term wide adoption (nVidia seems to be trying to make chips that are too expensive for their own good). Right now, though, you'd want to know that the specific application you use can use it for the specific tasks you do, before you decide to buy powerful video cards. When just about everything has support, and the GPUs can do every operation on every data type that your CPU can do, including doubles and all ints (read: Decimal, and Decimal-like bignum), with exceptional performance in highly parallel operations compared to your CPU (if the worst case is >4x of a good CPU for large matrices, you'll always want to use the GPU for them), then it will be guaranteed to matter.
So, it's not quite there yet, but soon it will be.
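To put the "highly parallel operations" bit in concrete terms: dense matrix multiplication is the classic case, because every cell of the result is an independent dot product, so all of them can in principle be computed at once. A minimal pure-Python sketch of that workload shape (illustrative only; this runs on the CPU, and the `matmul` name is just mine, not any particular GPU API):

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows.

    Each C[i][j] depends only on row i of A and column j of B,
    so every output cell is independent work -- exactly the kind
    of job a GPU can spread across thousands of threads.
    """
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

The catch the post describes is data types: this works for Python's arbitrary-precision ints and doubles alike, whereas a GPU only wins once it supports the types your application actually needs.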
You think nVidia are going to fall somewhat?
Opinions on Larrabee?
I don't have a source, but really, it makes sense. Everything in my PC, all totaled, has cost me just about $1000, and could be replaced by quad-cores, with as much performance per core, and a decent fanless gaming video card, for under $700, including the above-mentioned monitor, a high-quality CPU heatsink, a nice PSU, etc. Most people don't buy PCs much above $500, these days. Apple, OTOH, doesn't have too much available under $1000.
With cloud services becoming so prominent, I guess people really don't need to.
Yes, there are. It took me almost five minutes of interrogation, just a couple of days ago, to figure out that what my mother wanted to do was rotate pictures 90 degrees and put text labels on them. Lots of people don't want to have to know anything remotely technical about how to do things with computers. They want them to be magical appliances.