Musafir_86 said:
-Hmm, so how about what Intel's (allegedly) doing? http://www.eetimes.com/author.asp?secti ... id=1318894 (via http://legitreviews.com/news/15752/).
truepusk said:
Best Buy probably isn't the broader market anymore. Most analysts don't think it will be long before they go the way of Circuit City, and those who disagree are probably betting on Best Buy being able to make major changes and adapt.
therealankit said:
Also, if some benchmarking applications make the device clock higher, couldn't developers make applications that run the device at the higher clock all the time, to get the best performance? That would be good for users.
Firehawke said:
ATi and nVidia were both caught in the past doing this with video card benchmarks, too. I particularly remember one incident involving Quake 3 benchmark optimizations about twelve years ago, where ATi was applying specific optimizations that would only trigger when it detected that the calling application was "quake3.exe".

Made it really easy to prove, too. Just rename the EXE and you'd see performance drop considerably.
f0xik said:
nVidia lists in their driver release notes which games they improved performance for. You can also edit the application profiles through their control panel. I believe that's transparent enough.
truepusk said:
ShlomoAbraham said:
Any theories as to why? If the chip is there, why not let other apps use it? Save battery life?

Yes. Battery life.
deathBOB said:
Boy I sure don't care about mobile benchmarks.
http://techreport.com/review/3089/how-a ... -quake-iii

Tyler X. Durden said:
If it is the incident I'm thinking of, I'm pretty sure it involved pre-loading and keeping in cache certain textures that showed up in a demo sequence in those games that was commonly used for benchmarking (because it was a consistent exercise of the game engine that could be duplicated over and over).

F22Rapture said:
Firehawke said:
ATi and nVidia were both caught in the past doing this with video card benchmarks, too. I particularly remember one incident involving Quake 3 benchmark optimizations about twelve years ago where ATi was doing some specific optimizations that would only trigger when it detected the calling application was "quake3.exe".

Made it really easy to prove, too. Just rename the EXE and you'd see performance drop considerably.
Was that specific to the benchmark itself though? Making game-specific optimizations isn't really cheating. Cheating would be putting in extra effort to make the benchmark run better than the game does (optimizing a JS engine for Sunspider), or unlocking extra hardware functions not available to anything else (as Samsung seems to have done), or some other hack which misleads the benchmark (such as ATi reducing default image quality and accuracy to get a higher FPS).
So the optimizations would really only work for that particular map and to a certain extent that path.
sd4f said:
This in a way reinforces my gripe with the mobile market. Companies aren't as interested in making excellent consumer products as they are in making sales.

I suppose it's a chicken-and-egg thing, except that making a good phone that builds goodwill is a lot slower than attacking the spec sheet and getting immediate sales.

I think Samsung has demonstrated that they prefer the latter: win the contest on tech specs, make the sales, and abandon the phone when the next one nears release.
You must buy from a different industry than I do.

lowlymarine said:
This is, regrettably, the way of the industry now.
Cerberus™ said:
I'm still interested to hear an explanation of why Samsung doesn't allow this high 533 MHz clock for all demanding applications, instead of only for these benchmarks. If the phone is technically capable of running at 533 MHz occasionally, how does it help Samsung to throttle it? There has to be an economic rationale that can truly explain this.

I don't think it can be battery life, because the boost would only be triggered when a game demands it, just like any regular CPU governor. Or am I mistaken?

Is it that using this high frequency for a prolonged time overheats the device or makes it unstable? Then they could just put a time limit on it, or throttle it automatically when it reaches a certain temperature. Again, I believe this is how standard CPU governors work, isn't it?

The only situation where it would make sense to limit this frequency to benchmarks is if a short period at 533 MHz were somehow a *scarce resource* that is "used up". How is that possible? The only thing I can think of is that *any* period at 533 MHz slightly damages the CPU, or has a slight chance of damaging it, so that the total time the CPU can run at that speed is very limited. If, say, they have calculated that 3 hours at 533 MHz reduces the average lifetime of the CPU by 20%, or something, you have a reason to limit it not just within a short period, but over the entire lifetime of the phone. If you let people use 533 MHz at will, they kill their phone; if they only use it a couple of times for benchmarks, there is little damage, and you profit from the misrepresentation of what the phone is actually capable of.
Does that make sense at all?
Following on from their astroturfing campaign, I think Samsung has now earned the “guilty until proven innocent” approach that your idea suggests, but I'd rather have Samsung's Director of Engineering describe why this arrangement was deemed to be in the customers' best interest.

matthewslyman said:
Would this be grounds to retroactively reduce all of Samsung's recent history of benchmarking results by a factor of 533/480?
Huh, “they all do it,” eh?

Dadlyedly said:
sd4f said:
This in a way reinforces my gripe with the mobile market. Companies aren't as interested in making excellent consumer products as they are making sales.

I suppose it's a chicken and egg thing except making a good phone which will build good will is a lot slower than attacking the spec sheet and getting immediate sales.

I think samsung have demonstrated that they prefer the latter; win the contest on tech specs, make the sales and abandon the phone when the next one nears release.
This is what we call "modern capitalism." Ever since the theory took hold that CEOs and boards are only obliged to keep stock prices high for shareholders, this is what happens. Companies that realize they also have obligations to their customers, employees, and the company itself do better in the long run, but who pays attention to anything beyond the end of the quarter anyway?
Firehawke said:
ATi and nVidia were both caught in the past doing this with video card benchmarks, too. I particularly remember one incident involving Quake 3 benchmark optimizations about twelve years ago where ATi was doing some specific optimizations that would only trigger when it detected the calling application was "quake3.exe".

Made it really easy to prove, too. Just rename the EXE and you'd see performance drop considerably.

That could be claimed to be a smart compiler. But then, the Intel compiler has in the past produced code that would drop back to 386 instructions if the CPU didn't identify itself as "GenuineIntel"...

Musafir_86 said:
-Hmm, so how about what Intel's (allegedly) doing? http://www.eetimes.com/author.asp?secti ... id=1318894 (via http://legitreviews.com/news/15752/).

Regards.
Grimmash said:
deathBOB said:
Boy I sure don't care about mobile benchmarks.

To play devil's advocate, mobile computing is still an area where performance can be a factor, depending on the application. For desktops, most people can afford a computer that far exceeds the demands of the software that actually matters to them. While Ars might be a hotbed of people tweaking to get Crysis or some similar application running at 120 FPS, mobile devices can still fail to hit the proper sweet spot of speed and battery life. I know the MacBook Air I am typing this on is woefully unable to do many things my desktop, a full two years older, can just shrug off.

Cheating on mobile benchmarks can have real consequences when mobile phones cost as much as a video card with a contract, and as much as a video card plus RAM plus something else without one.
hobgoblin said:
That could be claimed to be a smart compiler. But then, the Intel compiler has in the past produced code that would drop back to 386 instructions if the CPU didn't identify itself as "GenuineIntel"...

Musafir_86 said:
-Hmm, so how about what Intel's (allegedly) doing? http://www.eetimes.com/author.asp?secti ... id=1318894 (via http://legitreviews.com/news/15752/).

Regards.

At this point, the only compiler I would trust to produce an impartial binary is GCC.
"Intel Compiler 11 does something miraculous. It interchanges the two loops, thereby hoisting the unpredictable branch to the outer loop. So not only is it immune to the mispredictions, it is also twice as fast as whatever VC++ and GCC can generate! In other words, ICC took advantage of the test loop to defeat the benchmark..."
At least one journalist did call Ferrari out for this: he is now banned from testing their cars.

Adam Starkey said:
I totally agree. The auto industry sending out ringers to magazines for testing is pretty much the accepted norm. It's wrong, and if no one calls them out on it, then there's no incentive for anyone to play fair.
Ok, so it was clever in regards to the Antutu test loop. But the incident I was referring to was how ICC would ignore the agreed-upon feature flags for things like SSE, instead keying off the supposedly cosmetic "GenuineIntel" vendor string. There is one example where ICC-compiled code ran 30% faster on a VIA CPU that faked the "GenuineIntel" string. That, IMO, makes ICC very much untrustworthy for use with benchmarks or anything similar.

Zarsus said:
hobgoblin said:
That could be claimed to be a smart compiler. But then, the Intel compiler has in the past produced code that would drop back to 386 instructions if the CPU didn't identify itself as "GenuineIntel"...

Musafir_86 said:
-Hmm, so how about what Intel's (allegedly) doing? http://www.eetimes.com/author.asp?secti ... id=1318894 (via http://legitreviews.com/news/15752/).

Regards.

At this point, the only compiler I would trust to produce an impartial binary is GCC.
Intel's compiler would degrade performance for non-Intel CPUs, but this doesn't make the compiler bad; it just makes it bad for other CPUs, and it certainly produces valid output. Intel doesn't cheat in this regard; it's a testament to their compiler that it can identify benchmarking code that ultimately produces no visible results and remove or restructure it. For an example, check out the comments on this StackOverflow question:

"Intel Compiler 11 does something miraculous. It interchanges the two loops, thereby hoisting the unpredictable branch to the outer loop. So not only is it immune to the mispredictions, it is also twice as fast as whatever VC++ and GCC can generate! In other words, ICC took advantage of the test loop to defeat the benchmark..."
I'm pretty sure the primary goal of SunSpider is to make sure every feature or bugfix added to WebKit either makes it faster or has no performance impact.

lowlymarine said:
SunSpider these days is a test of how well your JS engine cheats at SunSpider, nothing more.
Violynne said:
...

Over time, these start to gum up the inner workings. Even as I was reading up on Google Android, one of the most prominent comments I read was that my phone *will* degrade over time due to apps.

I'm going to try to stave off this issue by limiting the apps I install (so far, only 8), but to me, this seems counter-intuitive for a *smart*phone.
doppio said:
Violynne said:
...

Over time, these start to gum up the inner workings. Even as I was reading up on Google Android, one of the most prominent comments I read was my phone *will* degrade over time due to apps.

I'm going to try and stave this issue by reducing the apps I install (so far, only 8), but to me, this seems to be counter-intuitive of using a *smart*phone.
That's not entirely true, though: your phone will slow down if it is running many apps concurrently, but it will not run any slower if you install an app and don't run it. I have over 60 apps, and my phone is just as responsive as on day 1 (well, it's actually faster, since I removed some of the always-on apps that came with it).

You do need to take care not to clog the phone with autostart and background apps. It's not much different from having to clean your house and take out the trash occasionally; the alternative is renting a room in a hotel where someone else cleans up for you, with the added inconvenience that you're never really at home, you lack some amenities, and you pay extra.
OK, so there's more than one possible explanation.

koolraap said:
As for who's responsible? It could be the marketing department. It could just be Samsung's Programmer of Legend(tm) who squeezed 10% more out of the phone during one busy coding frenzy. "See boss? It's fast. And it's got the battery life it says on the box."