Silentwave
Jul 14, 06:22 PM
320 would be the standard. you could upgrade to a terabyte if there are still two HDD bays.
Heck you could have 1.5TB with the new Seagate 750GB drives!
Multimedia
Jul 21, 01:51 PM
Yes, with the possibility of a Mac Pro with 8 cores on the horizon, it makes sense to skip the 4-core altogether. Or, start with the lower end of the 4-core range (say 2GHz) and then, if necessary and possible, upgrade it to 8 cores. I wonder if waiting for 8 cores is going to be a common sentiment. In that case, it would make sense for Apple to offer an upgrade path to it.
There may be unknown variables in upgrading from 4 to 8 cores such that I would not want to take that path. I would rather have 8 cores on a new motherboard with faster RAM etc. supported, to get the most out of all of them at newer, faster speeds.
littleman23408
Dec 3, 03:10 PM
Some of them do, but I'm not sure they all do. I've got several nice rides from those series, but they are mainly from the higher-level series.
Cool, thanks. You must be pretty far?
macpross
Aug 6, 11:28 PM
Great joke, thanks very much...in the same line as Tiger Computer Dealers, right?
We already have a Mac Pro line of products, we are also the owners of AppleLocks, and MacMice. The Tiger thing was silly.
Iconoclysm
Apr 19, 08:46 PM
Motorola had iDEN well before Apple had an iPhone. Apple copied the i just like they did the Beatles' logo. They are the innovators of copying. But it's ok when they do it.
Motorola wasn't the first company to create an iProduct and using an Apple may have infringed on The Beatles' production company's logo (not The Beatles' logo) but it was not a US company. Do you really think that Jobs got the idea for using the Apple name from The Beatles?
Amazing Iceman
Mar 22, 04:46 PM
Well said. It's hard to even have a civil conversation here anymore. Not sure what the majority age group here is now, but the discussions since I joined just a couple of years ago seem to have declined into immaturity. There are a handful of respectful and open-minded people who do back up their thoughts with details and sense, but you'd have to wade through a lot of "fanboy" (I hate that term) jargon to sift out the ones worth replying to.
True. The debate gets too personal, and starts losing credibility after a while.
If this were a live debate, there would have been a shootout already.
Cool off people, and provide solid arguments to sustain your point.
HBOC
Apr 7, 11:05 PM
Normally I'd call BS, but I got mine at Best Buy, and when my friend, a former employee, asked if they had any more, they said technically no, but for him they'd "find" one. Thank god I got it from there for reward pointssss!
My reward point coupons always come the day after they expire anyways. Plus newegg and amazon are cheaper on most things. Too bad circuit city went down...
Vegasman
Apr 27, 08:43 AM
I think it is quite conceivable that keeping those logs forever, not encrypting them, maintaining them despite an opt-out, and not removing the timestamps was done in the spirit of: "Let's keep the data, maybe it will be useful at some point, and why bother to encrypt it? That's just some extra lines of code to write."
And it is this spirit which is somehow worrying.
This is the most likely explanation for me (too).
cgc
Jul 20, 02:45 PM
I think I'll still get the low-end Intel Tower in August/September, but I'm curious if the XEON 51xx chip could be replaced with a quad-core Intel chip.
DaveP
Aug 6, 05:34 PM
I find it amusing how optimistic Mac users are. Every once in a while Apple has an event where they really wow with product releases, but seems like 9 out of 10 people are predicting amazing releases. By the way, I'm not criticizing in any way, and being optimistic is good.
I'm predicting Steve will announce his retirement :eek: :p
Probably about as likely as some of the wish lists we've seen, haha.
skunk
Mar 3, 04:44 AM
Lee, first, do me a favor when we correspond with each other, would you? Please don't say "feel" when you mean "believe" or "think." This conversation isn't about emotion. It's about truths and falsehoods.
If it were about truths and falsehoods, surely everybody would agree? But it isn't, is it? It's about how you feel about it.
Third, if the Catholic Church is right, I didn't make the rules. God did.
You are simply avoiding responsibility for your own prejudice by an appeal to a spurious authority.
Fourth, again, I say what I believe.
Or, to put it another way, what you feel.
Does anyone notice a hint of natural teleology there, hmm?
What does your own condition hint at in terms of "natural teleology"? What does the homosexuality exhibited by hundreds of other species tell you about "natural teleology"?
License causes chaos.
This statement indicates that you are an authoritarian with a very dim view of human nature.
I don't see any point in being sexually attracted to anyone of the same sex, since I think homosexuality is a psychological problem caused by nurture, not by nature.
Well, in that you are quite simply wrong. There are plenty of studies of identical twins which prove otherwise. You should lay the blame for this "aberrant" behaviour squarely at the feet of your aberrant "god"/nature, rather than seek to persuade people that their nature is "wrong".
gnasher729
Aug 17, 05:32 AM
They are comparing a two-generations-old G5 (Dual 2.5) versus a new Intel (Quad 2.66) which is not even the fastest out there. What kind of comparison is that?
If you want to know what is the fastest Mac, the comparison is no good. If you want to know whether you should upgrade your machine, the comparison makes a lot of sense. First, the 2.66 GHz Quad has the best price/performance ratio. If you start with the 2.0 GHz, you get 666 MHz more for $300, then you get another 333 MHz for a mere $800. So if you want to upgrade, the 2.66 is _the_ machine to buy. Second, there will be much less difference between a Quad G5 and a Quad Xeon. On performance critical Rosetta applications (like Photoshop) the Quad G5 will be stronger. In that case, it doesn't matter how much stronger - you won't upgrade, that is all that matters. But if you have a dual G5, then the question whether to upgrade or not is really interesting.
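The price/performance claim above can be checked with quick arithmetic (the step prices and clock deltas are the ones quoted in this post, not official figures):

```python
# Upgrade steps as quoted above: MHz gained and USD cost for each step.
steps = [
    ("2.0 GHz -> 2.66 GHz", 666, 300),
    ("2.66 GHz -> 3.0 GHz", 333, 800),
]

for name, mhz, usd in steps:
    # Dollars paid per extra MHz of clock speed.
    print(f"{name}: ${usd / mhz:.2f}/MHz")
# -> 2.0 GHz -> 2.66 GHz: $0.45/MHz
# -> 2.66 GHz -> 3.0 GHz: $2.40/MHz
```

The last step costs more than five times as much per MHz, which is why the 2.66 looks like the sweet spot.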
And we need to know whether apps use four cores or not. In many cases, changing from two threads to four threads is very easy (that is, if all the threads do the same work; it is much harder if the threads do different work), but the app uses only two threads because most machines had only two CPUs. As an example, early versions of Handbrake didn't gain anything from Quad G5s; the CPUs were 50% idle all the time. People complained, and it was changed. The same thing will happen again, especially since _all_ Mac Pros have four cores.
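A minimal Python sketch of the "same work, more threads" point above: when every worker does identical chunks, scaling from two to four cores is just sizing the pool from the machine's core count instead of hard-coding two workers. The function and names here are hypothetical stand-ins, not Handbrake's actual code; a real encoder does its heavy lifting in native code.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(chunk):
    # Hypothetical stand-in for identical per-chunk work; a real codec
    # would run native code here (and release the GIL, so threads scale).
    return sum(x * x for x in chunk)

def encode_all(chunks, workers=None):
    # The whole "two threads -> four threads" change is this one line:
    # derive the pool size from the core count instead of hard-coding 2.
    workers = workers or os.cpu_count() or 2
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(encode_chunk, chunks))

print(encode_all([[1, 2, 3], [4, 5]]))  # [14, 41]
```

Because the threads are interchangeable, the same code runs unchanged on a dual- or quad-core machine; only the pool size differs.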
Multimedia
Jul 21, 12:20 PM
It really depends on your application.
On the desktop, if you're a typical user that's just interested in web surfing, playing music files, organizing your photo collection, etc., more than two cores will probably not be too useful. For these kinds of users, even two cores may be overkill, but two are useful for keeping a responsive UI when an application starts hogging all the CPU time.
If you start using higher-power applications (like video work - iMovie/iDVD, for instance) then more cores will speed up that kind of work (assuming the app is properly multithreaded, of course.) 4-core systems will definitely benefit this kind of user.
With current applications, however, I don't think more than 4 cores will be useful. The kind of work that will make 8 cores useful is the kind that requires expensive professional software - which most people don't use...
Cluster computing has similar benefits. With 8 cores in each processor, it is almost as good as having 8 times as many computers in the cluster, and a lot less expensive. This concept will scale up as the number of cores increases, assuming motherboards can be designed with enough memory and FSB bandwidth to keep them all busy.
I think we might see a single quad-core chip in consumer systems, like the iMac. I think it is likely that we'll see them in Pro systems, like the Mac Pro (including a high-end model with two quad-core chips.)
I think processors with more than 4 cores will never be seen outside of servers - Xserves and maybe some configurations of Mac Pro. Mostly because that's where there is a need for this kind of power.
I strongly disagree. I could use 16 cores right now for nothing more than simple consumer electronics video compression routines. There will be a Mac Pro with 8 cores this Winter 2007.
You are completely blind to the need for many cores right now for very simple stupid work. All I want to do is run 4 copies of Toast while running 4 copies of Handbrake simultaneously. Each wants 2 cores or more. So you are not thinking of the current need for 16 cores already.
This is not even beginning to discuss how many Final Cut Studio editors need 16 cores. Man, I can't believe you wrote that. I think you are overlooking the obvious - the need to run multiple copies of today's applications simultaneously.
So as long as the heat issue can be overcome, I don't see why 8 Cores can't belong inside an iMac by the end of 2008.
I apologize if I read a little hot. But I find the line of thought that 4 or 8 cores are enough, or more than enough, really annoying. They are not nearly enough for those of us who see the problem of not enough cores EVERY DAY. The rest of you either have no imagination or are only using your Macs for word processing, browsing and email.
I am sincerely frustrated by not having enough cores to do simple stupid work efficiently. Just look at how crippled this G5 Quad is already only running three things. They can't even run full speed due to lack of cores.
happyduck42
Apr 19, 02:12 PM
According to Wikipedia, it was released in February, before the iPhone was released...
Wikipedia is wrong, then; it was announced in February, after the iPhone's announcement in January 2007.
http://www.gsmarena.com/samsung_f700-1849.php
Bilbo63
Apr 19, 02:45 PM
Xerox's Star workstation was the first commercial implementation of the graphical user interface. The Star was introduced in 1981 and was the inspiration for the Mac and all the other GUIs that followed.
Thanks for posting that Yamcha. Xerox's engineers were seriously brilliant.
Edit... stripped out the images... no need to show them again. My bad.
thogs_cave
Jul 27, 10:11 AM
All of the reviews of the Core 2 Duo say that it crushes AMD in the desktop arena. This is good news.
This week, anyhow. This stuff goes back-and-forth like a tennis match.
I don't know if it's a good thing or not, it just is. I prefer AMD on the whole, as I like their design philosophy. But, I'm totally happy with the Intel chip in my MacBook. Whatever works. I find as I get older, the same computers get faster while I just get slower. :D
KnightWRX
Apr 9, 11:12 AM
I thought the 320m was also integrated? Wouldn't that mean that would be your only graphics card were nvidia allowed to add them to sandy bridge? I don't see why you would have integrated intel hd 3000 along with an integrated 320m (or successor).
Why not? A 320m successor would just destroy the Intel HD 3000, which is sub-par compared to the current 320m. Why not use 2 IGPs and go for a 2-chip solution, instead of using a dedicated GPU and having to rely on a 3-chip solution, if that 2nd IGP just blows away the first?
Heck, just disable the Intel 3000 HD entirely.
Intel got greedy.
faroZ06
Apr 8, 12:56 AM
Wirelessly posted (Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_1 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8G4 Safari/6533.18.5)
Guys, Apple is not to blame for this one. Well, other than doing business with a sleazy outfit like Best Buy.
Honestly, it has been like eight years since I've entered a Best Buy; everything about the place just feels undesirable and corrupt. The fact that many here are surprised at this nonsense highlights a marginal expectation for ethical behavior. No one really needs to shop at Best Buy; there are plenty of alternatives.
I don't know, I usually go into Best Buy and find stuff at good prices. The cables are a ripoff, but that's true almost anywhere. I'd still go to Best Buy for some stuff.
toddybody
Apr 6, 02:57 PM
This is like ESPN reporting on a 12min mile time for a Special Olympic Runner...
DrGruv1
Jul 15, 01:43 AM
Maybe along the lines of the G4 Quicksilver (without the handles):
a nice, short, compact Apple tower with more expansion than the mini, and with a Conroe, for....
$1099
Now you'd be talkin' :)
Let people switch out their monitors, etc., and give them a nice tower - not the stupid mini :) - I say (stupid mini) only because I wish it was a smallish tower with expansion capabilities :)
NJRonbo
Jun 14, 06:12 PM
What?!
No white phone?
Can you verify bibbz?
PlipPlop
Apr 6, 03:10 PM
Shame people are brainwashed by Apple with their crappy product, and the superior tablet is behind on sales. I'm sure it will pick up soon.
dougny
Nov 29, 09:13 AM
Lame. As if they aren't getting enough money as it is.
They aren't. The entire music business revenues are down 40% since 2001. Sales are down hugely. I can tell you from representing these artists that all the money is down too.
Are you spending as much on music as you did years ago?
iliketyla
Mar 31, 08:21 PM
Has LTD ever posted anything not pro-Apple?
I'll give it to you dude, you're very articulate and you have a way of spinning things to sound like you're right, but you are blatantly against anything that encourages competition or threatens Apple in any way.