NebulaClash
Apr 29, 07:54 AM
A reasonable question, AppleScruff. Indeed, my sample group includes staff, faculty, and students from different disciplines (including business/commerce, and engineering) at a university who use their Macs for research, graduate work, or lecture preparation; a prominent cardiologist at a large hospital; a financial advisor; professional musicians; and many others.
I am myself using a Mac in a business school seamlessly among my PC-using peers. There is nothing that they can do that I cannot - and many things I can do that they would have a difficult time doing in Windows. In fact, my colleagues have been so impressed that one has already made the switch recently, and another is preparing to switch as well. Those days of "needing to run Windows" for work are behind us.
That's been my observation in the business world as well. With projects often being Web-based now, Windows is becoming irrelevant. On one project with about twenty developers, systems architects and analysts, close to half were running MacBook Pros (no Windows installed) and doing very well. It's just not an issue for many office folks. Obviously there are some roles that still require Windows, but not as many as it used to be. The tech folks in particular seem to take great delight in moving to Macs. Times have changed.
ChazUK
Apr 28, 07:38 AM
Make up your mind what you want to count iPads as. Damn it, is it a mobile device or a computer? Someone give them a ****ing category already.
I wonder if those people who complain about iPads not being included in smart phone market share will also complain that the iPad is included in pc sales market share?
kingtj
Sep 26, 11:23 AM
I think he was probably just trying to say the same thing I've been saying about my new Mac Pro too. The OS and apps need to do some catching-up so we can fully utilize what we paid for!
Right now, the hardware is *way* ahead of the software in most respects. (We're still waiting another year for major apps like MS Office and Adobe Photoshop to go universal binary - much less see them coded for optimal use of a 4 or 8-core machine!)
You could argue that "I should have just bought an iMac." I suppose, but show me an iMac with a graphics card like the ATI X1900XT 512MB in it, or the ability to hold multiple internal hard drives. These are features I expect from any desktop system I use as my primary computer. I also already owned a perfectly good Dell 24" LCD panel, so didn't really want to buy a machine with the display built-in.
Are you trying to say that you spent too much for a computer and should have bought an iMac? What do you do with your computer? Web and email, or editing HD video?
nagromme
Mar 18, 12:54 PM
Anyway, I've never been one to agree with the Windows people who argue security-by-obscurity is why Mac OS X is not hacked to bits like Windows, but it would seem that this adds some serious fire to their argument.
Obscurity IS a factor that helps Mac users. The point is that good, secure design is ALSO a factor. But DRM file distribution doesn't relate to OS security/privacy anyway.
Anyway... you still have to BUY the song to use this hack.
PJWilkinson
Sep 12, 04:25 PM
I've just got back from the live streamed event in London and summarised the key highlights of the show here:
http://blog.crowdstorm.com
I wish I'd had my camera now. I did have a chance to play with all the products (except iTV) and must say the iPods look a lot smaller and the iTunes interface is very slick. iTV was basically a flat Apple Mac mini with lots of connectors out the back for the TV - no one could convince us that 640x480 would be enough for HDTV, or say which wireless protocol it would use.
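Since the resolution question keeps coming up: a raw pixel-count comparison (standard resolution figures, nothing from the event itself) shows why 640x480 seemed low for an "HDTV" box.

```python
# Standard resolution figures; pixel counts only, nothing about scaling quality.
def pixels(w, h):
    return w * h

assert pixels(640, 480) == 307_200        # the reported iTV output
assert pixels(1280, 720) == 921_600       # 720p: three times the pixels
assert pixels(1920, 1080) == 2_073_600    # full HD frame: 6.75x the pixels
```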
idea_hamster
May 2, 08:56 AM
So what does this do? What's the effect of the payload?
NathanMuir
Mar 13, 01:19 PM
Japan doesn't really have a choice BUT to build plants on the Pacific Rim, since that's where the country is located.
That, the lack of domestic oil and gas (90% of oil used in electric power is from the Middle East), plus a small highly populated country (rules out big hydropower) and they haven't got many options left. Linky (http://eneken.ieej.or.jp/data/en/data/pdf/433.pdf).
I didn't say that they didn't have the need (though I'm betting that they'll turn to green energy, in larger part, when they begin the rebuilding process; solar, wind, etc...).
I just questioned how well thought out the idea was to build these plants in an area that is highly susceptible to volcanic activity.
GGJstudios
May 2, 11:36 AM
4. Run a Spotlight search for "MACDefender" to check for any associated files that might still be lingering
That's a sure way *not* to find any related files.
The only effective method for complete app removal is manual deletion:
Best way to FULLY DELETE a program (http://forums.macrumors.com/showpost.php?p=11171082&postcount=16)
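For the curious, a manual sweep of the usual Library folders can be sketched roughly like this (hypothetical helper and folder list, assumed for illustration, not the method from the linked post):

```python
import os

# Common locations where apps leave support files; adjust for the app in question.
SEARCH_DIRS = [
    "Library/Application Support",
    "Library/Preferences",
    "Library/Caches",
    "Library/LaunchAgents",
]

def find_leftovers(app_name, home):
    """List paths under the usual Library folders whose names contain app_name."""
    matches = []
    for rel in SEARCH_DIRS:
        # os.walk on a nonexistent folder simply yields nothing, so missing
        # directories are skipped harmlessly.
        for dirpath, dirnames, filenames in os.walk(os.path.join(home, rel)):
            for name in dirnames + filenames:
                if app_name.lower() in name.lower():
                    matches.append(os.path.join(dirpath, name))
    return matches
```

Review the list by hand before deleting anything; a substring match will catch unrelated files with similar names.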
The one thing Macs need anti-virus for is to scan mail for Windows viruses, so that those don't get passed on to your PC. That is all.
That doesn't protect Windows PCs from malware from other sources, which is a far greater threat than receiving files from a Mac. Each Windows user should be running their own anti-virus, to protect them from malware from all sources.
Yes, so much. Because malware can copy itself and infect a computer.
No, only a virus can do that. A trojan requires user involvement to spread.
There are so few viruses for the Mac that when one appears, it's news... :)
This isn't a virus.
Mac OS X fanboys really need to stop clinging to the mentality that "viruses" don't exist for OS X and that "malware" is a Windows-only problem.
I agree. While no Mac OS X viruses exist at this time, that doesn't mean they won't in the future. And malware has always been a threat. What's important is to understand the kinds of threats and the most effective methods for protection.
The fact is, the days of viruses are long gone.
I wouldn't go so far as to say that. Just when you do, someone will release a new virus into the wild. While they may not be as prevalent as they once were, they're by no means extinct.
The fact is, understanding the proper terminology and different payloads and impacts of the different types of malware prevents unnecessary panic and promotes a proper security strategy.
I'd say the people who lump all malware into one category, making a trojan that relies on social engineering sound as bad as a self-replicating worm that spreads via a remote execution/privilege escalation bug, are the ones quite ignorant of general computer security.
The best defense a Mac user has against current malware threats is education and common sense. Understanding the basic differences between a virus, trojan, worm, and other types of malware will help a user defend against them. Doing simple things like unchecking the "Open "safe" files after downloading" option is quite effective.
I despise the "X is a file downloaded from the Internet" dialog introduced in SL. Really wish you could disable it.
That's one of the simple lines of defense for a user, as it lets them know they're about to open a newly-downloaded app. It only does that the first time you launch the app, so why bother disabling such a helpful reminder?
To the end user it makes no difference. It's fine if you know, but to a novice quickly correcting them on the difference between a virus, a trojan, or whatever else contributes approximately zero percent towards solving the problem.
Actually, it helps a user to have some understanding about malware. Part of the problem is a novice user is likely to engage in dangerous activities, such as installing pirated software, unless they know what a trojan is and how it infects a system. Also, understanding what a virus is, how it spreads, and the fact that none exist for Mac OS X will prevent them from instantly assuming that everything unexpected that happens on their Mac is the result of a virus. Also, understanding that antivirus apps can't detect a virus that doesn't yet exist will prevent them from installing AV and having a false sense of security, thinking they're immune to threats. Educating a user goes a very long way in protecting them, by teaching them to practice safe computing habits.
Mac Virus/Malware Info (http://forums.macrumors.com/showpost.php?p=9400648&postcount=4)
Tarzanman
Mar 18, 08:45 AM
Some of the responses on this thread are really amusing.
The people who are defending AT&T's actions are either astroturfing shills or dolts.
Here's a newsflash: just because you put something into a contract doesn't make it legal or fair. What if AT&T stipulated that they were allowed to come by your house and give you a wedgie every time you checked your voicemail? Would you still be screaming about how it's "justified" because it's written on some lop-sided, legalese-ridden piece of paper?
Given the way the current data plans are priced and, more importantly, *marketed* to customers, charging for tethering is double-charging for data.
The correct thing to do would be to have multiple (at least 3) tiers of data and stop differentiating tethered service. If the tetherers are using too much data, then charge them appropriately. What AT&T is currently doing is telling you that you can use up to 2GB of data, and then trying to charge you extra when they see that you might actually use that much (due to tethering).
hunkaburningluv
Apr 9, 06:04 AM
Apple will buy Nintendo eventually.
It's over for Nintendo.
Get ready for the iwii
I doubt it - Nintendo makes serious money on every console/handheld unit sold; they're set for the foreseeable future. IMO, while iOS is great for short bursts of gaming, it will never replace a dedicated gaming machine.
Doesn't matter. Apple took in two head gaming executives. Whether they called them up or were called up, they now have major gaming players in their family. It's a pretty clear sign that they will be getting into gaming in some way.
when they get Miyamoto or Iwata, then I'll be interested
These people are fleeing the "yellow light of death" on PS3 or "red ring of death" on 360. The consoles are so poorly made that broken PS3s seldom fetch $50 on eBay.
Apple has a real opportunity to make a name in gaming as gamers know quality and appreciate being taken seriously.
that's well, ********, to be honest, RROD has pretty much been eliminated and YLOD wasn't particularly widespread....
Multimedia
Oct 21, 10:03 AM
lmao and just to add, DAMN that is a lot of cores
Not if you use applications that are core hogs. Compressing video with Toast uses up to 4 cores per instance. Compressing video with Handbrake uses up to 3 cores per instance. So, no, it's not a lot of cores at all, and I will be buying a 16-core, then a 32-core Mac Pro the day they ship. :eek:
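The arithmetic behind that claim is simple; a quick sketch (using the per-instance core counts quoted in the post, which are the only assumed figures here) shows how many simultaneous encodes it takes to keep every core busy:

```python
import math

def instances_to_saturate(total_cores, cores_per_instance):
    """How many encoder instances it takes to keep every core busy."""
    return math.ceil(total_cores / cores_per_instance)

# Per-instance figures from the post: Handbrake ~3 cores, Toast ~4.
assert instances_to_saturate(8, 3) == 3   # three Handbrake encodes cover 8 cores
assert instances_to_saturate(8, 4) == 2   # two Toast encodes cover 8 cores
assert instances_to_saturate(16, 4) == 4  # scales to a hypothetical 16-core box
```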
peharri
Sep 23, 10:25 AM
Perhaps we've just been exposed to different sources of info. I viewed the Sept 12 presentation in its entirety, and have read virtually all the reports and comments on MacRumors, AppleInsider, Think Secret, Engadget, the Wall Street Journal, and MacCentral, among others. It was Disney chief Bob Iger who was quoted saying iTV had a hard drive; that was generally interpreted (except by MacCentral, which took the statement literally) to mean it had some sort of storage, be it flash or a small HD, and that it would be for buffering/caching to allow streaming of huge files at relatively slow (for the purpose) wireless speeds.
I've read absolutely everything I can too and I have to disagree with you still.
It makes absolutely no sense for Bob Iger to have been told there's "some sort of storage" if this isn't storage in any conventional sense. Storage to a layman means somewhere where you store things, not something transitory used by the machine in a way you can't fathom. So, we have two factors here:
First - Bob's been talking about a hard disk. That absolutely doesn't point at a cache, it's too expensive to be a cache.
Second - Even if Bob got the technology wrong, he's been told the machine has "storage". That's not a term you generally use to mean "transitory storage for temporary objects".
The suggestion Bob's talking about a cache is being made, in my view, because people know it'll need some sort of caching to overcome 802.11/etc. temporary bandwidth issues. (Hmm. Kind of. You guys do know we're talking about way less bandwidth than a DVD requires, right - and that DVD-formatted MPEG-2 will transmit in realtime over an 802.11g link? What's more, for 99% of Internet users, their DSL connection has less bandwidth than their wireless link, even if they're on the other side of the house with someone else's WLAN in range and on the same channel. Yes, 802.11 suffers drop-outs, but we're talking about seconds' worth of video affected, not hours.) As such, you're trying to find evidence that it'll deal with caching.
YOU DON'T NEED TO. A few megabytes of RAM is enough to ensure smooth playback will happen. This is a non-problem. Everyone who's going this route is putting way too much thought into designing a solution to something that isn't hard to solve.
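A back-of-the-envelope check supports this; the bitrate and throughput figures below are rough assumptions, not measurements:

```python
# Rough figures (assumptions): DVD-quality MPEG-2 peaks around 8 Mbit/s;
# real-world 802.11g throughput is roughly 20 Mbit/s.
DVD_PEAK_MBPS = 8.0
WIFI_G_MBPS = 20.0

def buffer_seconds(buffer_mb, bitrate_mbps):
    """Seconds of playback a RAM buffer covers at a given stream bitrate."""
    return buffer_mb * 8 / bitrate_mbps  # MB -> Mbit, then divide by drain rate

# An 8 MB buffer rides out 8 seconds of dropouts at DVD peak bitrate,
# and the link can refill it faster than playback drains it.
assert buffer_seconds(8, DVD_PEAK_MBPS) == 8.0
assert WIFI_G_MBPS > DVD_PEAK_MBPS
```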
Nonetheless, because it's an "issue", everything is being interpreted in that light. If there's "storage", it must be because of caching! Well, in my opinion, if there's storage, it's almost certainly to do with storage. You don't need it for caching.
I'm trying to imagine a conversation with Bob Iger where the issue of flash or hard disk space for caching content to avoid 802.11 issues would come up, and where the word "storage" would be used purely in that context. It's hard. I don't see them talking about caches to Iger. It makes no sense. They might just as well talk about DCT transforms or the Quicktime API.
I'm perfectly willing to be wrong. But I don't think I am. Let's continue reading the reports and revisit this subject here in a day or two.
Sure. I'm perfectly willing to be wrong too. I'm certainly less sure of it than I am of the iPhone rumours being bunk.
Regardless of the truth, I have to say the iTV makes little sense unless, regardless of whether it contains a hard disk or not, it can stream content directly from the iTS. Without the possibility of being used as a computer-less media hub, it becomes an overly expensive and complicated solution for what could more easily be done by making a bolt-on similar to that awful TubePort concept.
I'm 99% sure the machine is intended as an independent hub that can use iTunes libraries on the same network but can also go to the iTS directly and view content straight from there (and possibly other sources, such as Google Video.) I can see why Apple would make that. I can see why it would take a $300 machine to do that and make it practical. I see the importance of the iTS and the potential dangers to it as the cellphone displaces the iPod, and Apple's need to shore it up. I can see studio executives "not getting it" with online movies if those movies can only be seen on laptops, PCs, and iPods.
If Apple does force the thing to need a computer, I think they need to come out with an 'iTunes server' box that can fulfill the same role, and it has to be cheap.
WilliamG
May 30, 09:59 PM
I drop so many calls on AT&T 3G that it's a joke. I drop basically no calls on AT&T EDGE. Seattle, here.
jobesucks
Apr 15, 04:03 PM
The lack of resolution independence sucks on the Mac, but I think I'm right in saying Lion will fix that.
Also, Mac networking sucks: PCs rarely show up in Finder (sometimes they do, sometimes they don't), and I have to Cmd-K far too often. Well, in my experience, anyway.
Other than that, not much else.
bugfaceuk
Apr 9, 10:42 AM
Are PR people not supposed to stop everyone hating you?
Hang on. Let me just parse the negatives in that sentence.
"Aren't PR people supposed to make everyone like you"
Right that's better.
Yes they are...
Shivetya
Apr 28, 12:29 PM
It's not like the market for $1000+ computers is inexhaustible. They had to throw in tablets while they can to maintain market position, before the cheap tablets start coming out (and they will; it took a while for notebooks to get cheap, and look at where they are now).
SMM
Oct 21, 12:52 PM
It will come, just not with the initial production models. With the quad-core chips, Intel is already running into FSB bandwidth issues as it is. The Clovertowns are essentially dual Woodcrest CPUs stuck in the same package, sharing the same FSB, so communication between the first dual-core CPU and the second must travel onto the FSB and into the other CPU. Between the two cores that are linked directly, data sharing can be handled through the shared L2 cache. So, depending on your application, the 8-core may be no better than a 4-core system - if what you're doing is already maxing out your CPU bus bandwidth. Somewhere down the road, as Intel shifts to its 45nm production process and fully integrates all 4 cores on a single die (and later, 8 cores on die), we will see massive improvements in inter-core bandwidth. They will have to step up FSB bandwidth, though - possibly by increasing the clock speed, but more likely we'll see some of that combined with widening the data path and possibly multiple parallel FSB designs. It's going to be interesting, that's for sure. And with Intel's new process and the plans for continuously jamming more cores onto a die at higher speeds, I think we're in for a real ride over the next 5 years or so.
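The shared-bus point can be illustrated with made-up but plausible numbers (the bus figure below is an assumption, not an Intel spec): at a fixed FSB bandwidth, each doubling of the core count halves the per-core share.

```python
# Illustrative only: assumed bus figure, not an Intel spec.
def per_core_bandwidth(fsb_gb_per_s, cores):
    """Each core's share of a shared front-side bus, in GB/s."""
    return fsb_gb_per_s / cores

FSB_GB_PER_S = 10.6  # e.g. a 1333 MHz, 64-bit bus: 1333e6 * 8 bytes ~ 10.6 GB/s

# Doubling the cores on the same bus halves each core's share:
assert per_core_bandwidth(FSB_GB_PER_S, 4) == per_core_bandwidth(FSB_GB_PER_S, 2) / 2
assert per_core_bandwidth(FSB_GB_PER_S, 8) < 1.4  # 8 cores: ~1.3 GB/s each
```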
Absolutely agree. It must be exciting to be an EE working on this stuff right now. So many options to explore. How would you design a memory bus which would be dynamic enough to adjust for a doubling of processors? If you had a fixed, known number of processors, the design is straight-forward. But, the new multi-core design is not something they have had to deal with before. I wonder how they will do it?
auero
Mar 18, 07:59 AM
I don't understand the ranting about why AT&T charges more to tether. Sprint and Verizon do it too. Just because your jailbreak method doesn't work anymore shouldn't make you mad. The system caught up to you. Yes, it's stupid to pay extra for data, but that's just how it is, and people are still going to pay for it, so complaining won't do anything.
I'm glad those people who are abusing the service and using 6+ GB of data so they can tether are finally getting the boot. It bogs down the network. Unlimited doesn't mean unlimited in the fine print either. It's the same on every network, so don't blame AT&T.
brianus
Sep 27, 08:44 AM
Yes, Intel will be shipping Clovertowns then - but when will Apple get around to putting them in systems? (November - well, that can wait for The Lord God Jobs' keynote in January, for sure.)
Most vendors are putting Merom systems in their customers' hands, but Apple is still shipping Yonahs in the MacIntelBooks.
I'm at IDF at Moscone, and most of the booths have Kentsfield or Clovertown systems running. (Apple isn't in the hall.)
I think that you're being very brave in assuming that Apple will ship quads in systems when Intel releases them...
Not to mention the fact that they waited a month and a half after Woodcrest was released to announce the Mac Pro and Intel XServes -- based not on Intel processor release schedules but on Mac conference schedules. Then again, this is just a "core bump", rather than a truly new product or chip; IIRC the Quad G5 followed fairly soon after the dual-core G5 processors were announced. Then again AGAIN, the XServes won't even be available 'till October; would they really update them again one or two months later?
Huntn
Mar 13, 05:53 PM
It's the cleanest and usually the safest type of electricity available that can produce energy on a large scale.
When there are no accidents it is a good source of power except for the incredibly toxic waste. Murphy's Law says there must be accidents and unforeseen events.
There are inherent risks with nuclear power and there is the waste issue yet to be solved. But likewise, there are risks for other types of power, whether it's gas, oil, coal or even hydroelectric. Choose your poison.
Speaking of poison: ten thousand barrels of radioactive waste with a half-life of 1000 years... Who gets to keep that in their backyard? I'd say launch it into space, but then I have visions of a rocket malfunction requiring explosive detonation.
Granted, in the history of nuke power there has only been one worst-case scenario, but that one was a doozy. Sure, they say it can never happen, but when a coal-fired plant blows up it does not contaminate 4000 square miles. This makes nuclear power both wonderful and terrifying at the same time, because we all know accidents must happen. The question is how long and how big the worst of those accidents will be. Personally, I'd look for other green, not yellow, solutions.
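For what the half-life figure above actually means in practice, here's a quick sketch of exponential decay. The 1000-year half-life is the figure from the post; the specific years chosen are just illustrative.

```python
# Fraction of radioactive material remaining after t years,
# given a 1000-year half-life (the figure mentioned above).
def fraction_remaining(t_years, half_life_years=1000):
    return 0.5 ** (t_years / half_life_years)

# Even after ten half-lives (10,000 years), ~0.1% of the material remains.
for t in (1000, 2000, 10000):
    print(f"after {t:>5} years: {fraction_remaining(t):.4%} remains")
```

The takeaway: a 1000-year half-life means the waste stays hazardous on a timescale far longer than any containment structure has ever been tested for.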
http://www.scienceprogress.org/wp-content/uploads/2007/12/radioactive_symbol_250.jpg
I've read in Russia, there are areas with posted signs that say something to the effect of "Roll Up Your Windows and Drive as Fast as You can for the Next 50 miles"... Read about Chernobyl here (http://en.wikipedia.org/wiki/Chernobyl_disaster_effects).
http://upload.wikimedia.org/wikipedia/commons/thumb/2/23/Chernobyl_radiation_map_1996.svg/400px-Chernobyl_radiation_map_1996.svg.png
takao
Mar 13, 04:04 PM
All we can decide is whether we build them ourselves. We have a very real fuel crisis that manifests itself in war and terrorism, and will only get worse.
Really?
I live in a country which isn't at war... and hasn't been for quite a few years. And by years I mean decades.
And the nuclear power plant we built was stopped, before ever being turned on, by a popular vote (since then we have a constitutional law forbidding the building of nuclear power plants...).
Wow, look at how I am suffering from the terrible consequences.
Lesser Evets
Apr 28, 07:27 AM
188% growth... that's impressive.
tveric
Mar 18, 11:53 PM
So, basically if you use PyMusique you are in violation of the TOS and because you need an iTunes account to even make use of PyMusique, Apple will know who is trying to violate the TOS.
Thus, as I said before, you'd have to be pretty stupid to even try and use this software.
Well, 18 hours later, here we are, I used a Pepsi cap song to download thru PyMusique, it plays perfectly and all that, and so far my account hasn't been cancelled. You know why? Because it JUST ISN'T WORTH THE FRIGGIN EFFORT on Apple's part to start cancelling accounts for using this software. They have to come up with a block to PyM anyway, and that will solve all their problems.
As for violation of the TOS, nobody gives a rip except people who were hall monitors in high school. And as for being stupid, well, maybe some of us just like our freedom without limits. You can attack us for being "stupid" all you want, but that doesn't necessarily make it the truth. Get used to it - DRM is a paper tiger. I buy music thru iTMS, I buy music on CD, I buy it at allofmp3.com for a dollar an album, and I download for free too. No amount of DRM is going to make me change my habits. Only differences in prices and convenience will make me shift from one method to another when required.
skunk
Mar 25, 11:14 AM
As marriage is licensed by the state, it is in fact a privilege. The fact that it is near-universally granted doesn't make it any more a right.
The fact that something is licensed does not change it from a right to a privilege.
Article 16 of the Universal Declaration of Human Rights declares that "Men and women of full age, without any limitation due to race, nationality or religion, have the right to marry and to found a family. They are entitled to equal rights as to marriage, during marriage and at its dissolution. Marriage shall be entered into only with the free and full consent of the intending spouses."
For most of Western history, marriage was a private contract between two families. Until the 16th century, Christian churches accepted the validity of a marriage on the basis of a couple's declarations. If two people claimed that they had exchanged marital vows, even without witnesses, the Catholic Church accepted that they were validly married.
State courts in the United States* have routinely held that public cohabitation was sufficient evidence of a valid marriage.