My sister-in-law is a high school teacher and has been wishing she could hook her iPad up to the projector since the day she got it.
Think of all the possibilities...makes so much more sense for a school to spend $500 on an ultralight, ultraportable iPad than spend more on bulky desktops you have to cart around just to play a video or show a web article (citation: my old high school).
There's not much more this feature lets me do on the app front, but man I'm excited to see what people do with it.
I assume the iPad 2 hasn't lost the original's capability of outputting VGA.
Edit: the reason it's a big deal is that the original iPad had to have the app specifically coded to deal with the external screen/projector (and few are). The new version just lets you mirror the content, whatever app you're in. It'd be nice if the new iOS retroactively made this possible on the original (at least at the same resolution?) but I kind of doubt it. We'll see next week.
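For anyone curious what "specifically coded" meant in practice: on the original iPad an app had to notice the external screen itself and attach its own window to it, or the projector just showed nothing. A rough sketch of that pattern (modern Swift purely for illustration; the 2010-era code would have been the Objective-C equivalents of the same UIScreen/UIWindow calls):

    import UIKit

    final class ExternalDisplayController {
        private var externalWindow: UIWindow?
        private var observer: NSObjectProtocol?

        func startObserving() {
            // A newly attached projector/TV shows up as an extra UIScreen.
            observer = NotificationCenter.default.addObserver(
                forName: UIScreen.didConnectNotification,
                object: nil,
                queue: .main
            ) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                self?.present(on: screen)
            }
        }

        private func present(on screen: UIScreen) {
            // The app must build and drive its own window on that screen;
            // without this, nothing appears on the external display.
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen
            window.rootViewController = UIViewController() // your slide/video view goes here
            window.isHidden = false
            externalWindow = window
        }
    }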
Yeah, I read that bit. I guess I wasn't clear: it means the iPad, iPhone4 and iPod Touch 4g have had the ability to output HDMI all along. It's pretty weird they'd keep that quiet IMO.
Yup sorry, from the iPad 2 site: "Just plug in the Apple Digital AV Adapter or Apple VGA Adapter (sold separately) and your HDTV or projector becomes a bigger version of your iPad."
I didn't see anything non-HDMI mentioned in the liveblog so I wasn't sure if Apple made it work retroactively. Glad that they did!
The mirroring doesn't work. Having forced my iPad to do it via jailbreaking, I can say the A4 chip clearly isn't fast enough to do it for apps with any video or animation at all.
Which is more likely: a) Apple is shipping, and loudly promoting, a feature that "doesn't work" because of a fundamental, immutable hardware deficiency, or
b) The software implementation of an unsupported hack, used by an infinitesimal number of users, written by someone who is unpaid and not intimately familiar with the hardware stack and firmware, is inefficient.
Hopefully this will put a dent in SMART's Sympodium series, which in my experience is vastly overpriced for the way it's used. Three grand will buy you a pen-screen monitor that is wired in-between the computer and projector.
Because of the cost, most places will not buy dedicated podiums: ours had to be carted around and wired into the computer setup for each professor.
An iPad is much, much easier to deal with (everything's one unit), and there's a decent chance the professor already has one of her own, meaning tech staff don't need to set anything up.
Well, the reason they convince everyone it's going to be big is how they repackage it. That has been the case from the iPod on, and again here with HDMI. As others have mentioned, I don't need anything more than my iPad for HDMI presentations now.
Here's hoping the same support comes to the iPhone 5. As an app developer one persistently annoying thing is my inability to demo an app to an audience in anything but the Simulator.
I've been working on apps related to camera functionality, and the inability to show these apps on a projector is shitty. If I could mirror the display I'd be exceedingly happy.
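If system-wide mirroring does arrive, here's a hedged sketch of how an app could tell whether an attached screen is just a mirror (nothing extra to do) or a separate display it has to drive itself; modern Swift, UIKit names only:

    import UIKit

    func describeAttachedScreens() {
        // UIScreen.screens lists the built-in screen plus anything attached
        // through the dock-connector adapters.
        for screen in UIScreen.screens where screen !== UIScreen.main {
            if screen.mirrored === UIScreen.main {
                // System mirroring is active: the projector already shows the app.
                print("External screen is mirroring the device.")
            } else {
                // No mirroring: the app would have to render to it explicitly.
                print("External screen is a separate display.")
            }
        }
    }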
They only mention charging through the 30-pin port on the HDMI dongle. I wonder if any other 30-pin accessories will work when the HDMI mirroring dongle is attached?
I can feel you on the stealing aspect (though with the optional free engraving you can really limit the resell value), but they are not easier to damage than desktops. There are no moving parts. Unless you drop it in a sink, you're going to be OK (that would be ill advised with a regular PC too). The software is also incredibly hard to screw up, which lowers IT support costs.
It truly amazes me that a company can not only design, manufacture and develop software like this in near 100% secrecy, but that they can also have their website completely updated instantly after the keynote with beautiful graphics and videos. How they do it and keep it so under wraps astounds me.
1. They take steps that make it easier to identify leakers (like embedding secret numbers into promotional photos so they can tell who leaked them if they get published), and
2. They pursue anyone who dares leak information with the ferocity of a rabid dog.
While I don't disagree with your observations, I think it would be remiss to exclude the fact that many Apple employees are incredibly loyal and believe in the company.
When I started working for Apple (about 5 years ago now) it was my dream job (I had serious fanboitis). The new hires got a security orientation which was your basic "We spend lots of money protecting things; don't do anything lame."
But the reality of working there makes the magic unicorn dust wear off. I saw a bunch of prototypes and I knew a bunch of information that nobody else 'knew' but lots of outsiders suspected. I became more interested in my work than in selling other people on the idea that I had secrets I could pass on to others. (We were also told to 'eat our own dog food', which makes me highly suspicious of every Software Update to this day. I always wait a couple of weeks before installing anything new.)
That's an important part of the equation. If your employees are more interested in leaking information than working, maybe you're doing something wrong. (Just to clarify, I still consider Apple a great place to work; I left because I wanted to do something different, not because I felt that it was a poor place to work.)
Surely if you were encouraged to 'eat your own dog food', that would mean people inside Apple were testing stuff themselves, so eventual public releases would be better?
You might think so. It would be hard to test this empirically since I would need an alternate universe where people inside Apple did not test unreleased software on themselves before releasing. But seeing the process that goes into deciding what fixes go in and what fixes don't has educated me to NOT update unless I have a real need to. I saw lots of examples where a fix for something broke something else that was seemingly unrelated. Or that a feature that I liked depended on a bug that was going to be fixed. That's why I wait on Software Updates. And if I don't need the Update, I don't install it.
The problem with 'eating your own dog food' is that it is dog food, so it's really hard to eat. To comply with this edict, I would set up a separate machine and install the latest build and play around with it for about 15 minutes and spend a half hour (or more) filing bugs. Thus, I wasn't really eating my own dog food...
Installing new builds turned out to be a big time-sink. It took time for the build to install and then you had to go through the paces of setting the machine up. You also had to do a clean install because of incompatibilities with previous releases.
After all of this, you started testing. Then you started finding bugs, and since I was in engineering, I tried to make sure the bug was reproducible and even tried to track down which component was responsible (which is a mess if you're dealing with overlapping stacks of software). THEN, you had to crawl through the maze of reported bugs to see if there was something already reported that was close... Hopefully they've made the bug reporting system better since I was there (but I doubt it… the system I used had been in place for a long time).
On top of that, I needed to get actual work done. There was no way that I was truly going to use an ever-changing system of dubious quality to do work. I tried, and it was too difficult (although I knew of some folks who were faithful and did so).
'Eating your own dog food' is an OK idea. It's not clear to me that it was the best idea for this particular situation. As a general practice, it seems that if your product's scope is small enough, it's a good idea. If it's larger, hiring a set of dedicated test engineers would be awesome. (The ironic thing is that Apple had those too!)
True, but every company has mostly loyal employees (IMHO). The trick is keeping the dishonest ones in line, and the best way to do that is through fear. Apple makes it clear they'll go nuclear on anyone who leaks information, and the dishonest ones realize it probably isn't worth the risk.
I disagree. In a company with tens of thousands of employees it still only takes one to leak something. Loyalty is (IMHO) far more important than those other factors (although they contribute too).
Consider a more open example: Google. Internally Google is more open than I believe Apple to be (I have experience with the former, none with the latter other than what I've read).
You may think that a lot of stuff leaks from Google based just on TC stories and so forth, and while there are things I'm sure we could manage better, when I joined I was (and still am) amazed at what doesn't leak.
Google doesn't have the same militant, even draconian, approaches to secrecy that Apple does. What they have in common is that most people in both companies (IMHO) are very loyal.
I completely agree. Their website rollout team kicks ass! They should publish how they manage the 'browser caching' issue: whenever I go to Apple.com during/after a keynote, I get the latest page without doing a hard refresh.
I don't disagree that they must have their act together to support such big online launches. But for cache busting, isn't it just a simple matter of renaming resources and having your framework + deploy scripts manage the rest for you?
Most sites keep assets fresh by appending query parameters (e.g. file.js?1234566778). Apple looks like they are adding the parameters into the filenames themselves.
It probably also involves cache-control settings as mentioned, but I think file renaming is the more reliable method of cache busting.
I'm really not. max-age and the Expires header, properly set, will control caching at just about every possible step. If you refresh apple.com several times, you'll notice the max-age cycles down with each refresh, and the Expires header stays the same. They're obviously set up to cause the page to have a short maximum lifetime in both public and private caches, and in the case of a launch, it's not that hard to make sure that the old copy of the page expires at 11 AM PST or whenever you want your announcement to go live.
Announce product, push deploy button on website, people hit website and browser requests a new copy of the page because its current copy expired at 10:59:59, and presto, it all magically works.
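A tiny sketch of the arithmetic being described, with a made-up go-live time: the Expires header stays pinned to the announcement moment while max-age is just the seconds remaining, so it counts down on every refresh until caches everywhere drop the old page at once (Swift here only because it's handy; the same logic would live in whatever serves the pages):

    import Foundation

    // Hypothetical go-live moment: 11:00 AM PST (19:00 UTC) on keynote day.
    let goLive = ISO8601DateFormatter().date(from: "2011-03-02T19:00:00Z")!

    // Expires stays fixed at the go-live time, formatted as an HTTP date.
    let httpDate = DateFormatter()
    httpDate.locale = Locale(identifier: "en_US_POSIX")
    httpDate.timeZone = TimeZone(secondsFromGMT: 0)
    httpDate.dateFormat = "EEE, dd MMM yyyy HH:mm:ss zzz"

    // max-age is whatever is left until go-live, so it cycles down per request.
    let maxAge = max(0, Int(goLive.timeIntervalSinceNow))
    print("Expires: \(httpDate.string(from: goLive))")
    print("Cache-Control: public, max-age=\(maxAge)")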
But that only solves the case of predictable events. If you want to remain agile and do deployments on an as-needed basis, then you need to cache-bust with file naming or query parameters, as I mentioned in my other comment. Again, I'm pretty sure this is the preferred method of most web teams, and it also appears to be what Apple is doing (at least with a subset of their assets).
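For the curious, a minimal sketch of the filename approach, assuming a build step that fingerprints each asset with a content hash so the name changes whenever the content does (file names and paths here are made up):

    import Foundation
    import CryptoKit

    // Fingerprint an asset, e.g. main.js -> main.ab12cd34ef.js, and copy it
    // under the new name; the deploy step then rewrites references to match.
    func fingerprint(_ path: String) throws -> String {
        let url = URL(fileURLWithPath: path)
        let data = try Data(contentsOf: url)
        let hash = SHA256.hash(data: data)
            .map { String(format: "%02x", $0) }
            .joined()
            .prefix(10)
        let newName = url.deletingPathExtension().lastPathComponent
            + ".\(hash)." + url.pathExtension
        try FileManager.default.copyItem(
            at: url,
            to: url.deletingLastPathComponent().appendingPathComponent(newName))
        return newName
    }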
I seem to recall the last set of updates before this speech didn't work perfectly. People were seeing the results of mixed stylesheets and had to do forced refreshes to clear it.
Love how Apple up-sells storage in hundred dollar increments. I just bought a 32GB SD card (class 10) for my camera for $50 and I'm not buying millions of them at a time.
If you want to participate in a market where things are priced according to what they cost the vendor, look into gypsum. In tech companies, cost-based pricing is good mostly for getting you fired from your Product Manager role.
Tech products are priced according to the market. For many products, Apple competes aggressively on price. What that means is that within the product line, Apple wants there to be a SKU with the best pricing among its substitutes. It may also mean that every product in the line "wants" a price that is competitive according to its specs. But neither of those pricing goals means Apple will price anything according to the cost of components.
Finally, and perhaps most importantly, in complex markets one of the most important jobs marketing has is to segment the market, which means finding product attributes that command different price points to maximize profit for the company. It may be the case that, regardless of what it costs Apple to add storage to the iPad, the most common kind of customer that wants a large-storage iPad isn't very price sensitive.
This works in both directions. You can play the segmentation game in exactly the opposite direction in enterprise sales: release your product as open source, but sell a supported site-licensed edition. In virtually all Fortune 1000 companies, it is next-to-impossible to embark on any new IT initiative without an accompanying purchase order; the whole of what IT does is structured around selecting, procuring, and maintaining products.
Not all flash is the same. SD cards and USB sticks use cheap, slow, high-density flash manufactured with older processes. The A4/A5 flash is also integrated into the same package as the CPU(s). Most analysts estimate the flash as a significant part of the overall cost of iOS devices. Yes, it is market segmentation as well, but that extra flash still isn't free.
> The A4/A5 flash is also integrated into the same package as the CPU(s)
Really? That would be by far the most aggressive die stacking I've ever heard of, if true. I thought it was soldered to the motherboard in stacks of 2-4 dice.
See step 16 of iFixit's iPad teardown. It looks like both RAM and flash are separate assemblies. Only the iPhone has the RAM integrated with the CPU package.
To support my original point, though, iSuppli's breakdown of the iPhone 4 last summer had the 16 GB flash chip as the second-most expensive component, after the retina display, at ~15% of the total outlay on parts.
For "power users", the Xoom should be a clear winner, no? More free customization, tabbed browsing (and plenty of RAM to handle it nicely), and most importantly, it is free to develop for the Xoom from any OS.
As far as cheaper, maybe the Nook Color would be a smarter buy for the "penny-pincher".
Or, with the actual pricing compared to the competition, think of it this way:
The pricing is competitive enough that the price can drop in hundred-dollar decrements, even though a 32GB flash memory chip only costs $50. Maybe that's something you can only do if you expect to sell millions of them?
The rip-off pricing on storage and the obvious, purposeful omission of a card slot are two things that really turn me off from buying an iPad or iPhone.
From the article:
" Yes, we can all agree that the displays are seriously cruddy and the overall build quality and design, for lack of a better work, suck, but at this point we've just come to expect that for a tablet under $300. "
"Yes, we can all agree that the displays are seriously cruddy and the overall build quality and design, for lack of a better work, suck, but at this point we've just come to expect that for a tablet under $300."
That looks like trash compared to the iPad. We've seen in the market so far that building a tablet that is remotely close to the iPad in terms of hardware build, let alone software, costs more.
Not 'weird'. That is unrelated and dependent upon other factors.
I'm not referring solely to the iPad. Apple charges an extra $100 for 16GB of flash storage for iPhones and iPods. That's not market rate.
The whole idea of pricing storage like that came from the iPod, and I think by the time you have devices that are used for general computing like iPads, it's time to drop it. I would prefer to have a card slot.
(By the way, for people who may be downvoting my earlier comment because they think I'm some crazed anti-Apple partisan, I do own a MacBook Pro and an iPod. A client is also buying me an iPad 2 for completing a project, so... that's just my opinion about the storage pricing).
I am not sure whether I would want one. Having one would introduce a lot of opportunities, but it would also remove a lot of ease of use. Install an app => the user must select where to install it. User interface-wise, what would happen to the home screen if I take out a card with some applications on it? What if I plug in another card? What if that card contains an application that is already on the built-in flash? Will this support separating the executable from user settings? Etc.
The best solution for this I know of is the one Microsoft chose for their phones: no built-in flash, and a card slot placed so the card cannot be removed without shutting down the device. However, I am not sure that solution adds enough value to the device to warrant its downsides (a compartment door that makes the device slightly larger and likely a little bit less nice-looking).
I have a Palm Pilot that used an SD slot, and I thought they handled that very well. When you installed an app, it installed to main memory. You could move it to the SD card if you wanted to. And when you ejected the SD card, the home-screen icons for the apps on the card disappeared. They came back when you put the card back in.
I'm not saying I want that back, but I am saying that I thought it was a pretty nice easy way of making that work.
It can, but for some reason (memory leaks?) the iPad slowly loses that capability. I find occasionally powering it off brings back the ability to load multiple tabs without reloading pages.
They don't advertise the available memory of any of the iOS devices, not even to their Developers. The community finds out on their own (usually after teardowns or writing test apps).
FYI: If you are thinking about trading up from the iPad 1, gazelle.com is offering $320 for the 16GB WiFi version. This price won't stay for long since the iPad 1 16GB WiFi is going for $399 on apple.com. Also you can use the code "TWiT" for 10% more.
I don't have any personal experience with it (yet), but eBay's new "Instant Sale" appears to be offering up to $370 for a mint 16GB WiFi version with all the original bits.
I've been meaning to buy one of these for my parents for some time now. They rarely do anything other than email, web browsing, and photos. The improvements (especially the speed bump and cameras) finally give me a 'features' justification for buying it for them.
Looking at the sales page, there has been a de-emphasis of iBooks, apps, and games, and additional emphasis on easy, hands-off use for watching videos and AppleTV. For instance, the angled cover, and using the iPad with a projector.
Either this is turning out to be what people are actually using the iPads for, or Apple is actively steering people towards using iPads. Maybe they don't want to cannibalize the sale of Macs.
Looking at the keynote liveblog that Engadget did, it seemed like most of the focus was on how iBooks was doing (the lead), and on creation software: iMovie and some sort of GarageBand / Rock Band hybrid.
Thanks. That's a learning moment for me. Apple is very clever about speaking to their audience. The people who'd listen to Jobs live are content creators and avid readers. The average visitor just wants to watch TV.
- iPhone 5G will use this dual-core A5 chip
- there'll be a smaller, low-cost iPhone model (nano iPhone?)
- the next nano will play video (maybe combined with the above?)
- iPad 3G will play 1080p, and have 1GB RAM
These devices have enough power for a full OS. But I predict Apple will hold off combining them (OSX on iPad; A5 in laptop/desktop) for as long as possible: 1. it doesn't help the customers of the iPad; 2. it would segment the developer market, who can use the extra power anyway. However, it's inevitable as processors get faster; and they can't hold off forever, because if they don't do it, someone else will. It's just a question of when.
It represents the end-game of the smartphone disruption of laptops and PCs, because you will "dock" it at home and work, yet still have all your data with you wherever you go (regardless of coverage gaps, network outages, webapp provider downtime, etc.), like people used to do with their laptops. A bit like the portable disk drive that the iPod was.
You might think Apple would be hesitant to do this, because it will cannibalize their own laptops and desktops. I think they will do it as soon as they can, because: they have a history of racing down the tech curve as fast as possible; if they don't do it and it's possible to do, someone else will, and the strategic losses from not being first are enormous; it will only start to cannibalize their other products (because a smartphone really isn't as good yet; people are familiar with the old way; power users are a long way off being disrupted).
I've been waiting for this. Ye Mark my Words, the End is Nigh for the PC: July 2011
It's clear to me you don't understand the market then. Being the thinnest is a clear differentiator. Having aesthetic style means a LOT to a lot of people (ie, customers).
Furthermore, Apple did increase the processor size, memory, and probably battery (considering it has same 10hr life) and also decreased the total volume of the enclosure.
Finally, if you've ever used an iPad at length you'd realize it doesn't need cooling 99.9% of the time. It just doesn't get warm... this is a huge differentiator from the netbooks it's often compared to.
Apple even likes to "cheat" the thinness by making the edges thinner than the rest, so it seems thinner in photos, as if it's just hovering over the desktop.
The thin profile is actually what amazed me reading the specs. I have a new-ish iPod Touch and the form factor is one-of-a-kind. Even the iPhone is thicker than the Touch, with identical processing capabilities. Something like an iPad being about as thick as the Touch makes me more excited about buying one because it seems so darn futuristic.
If you look at the "TV and Video" section, it explicitly says it supports 1080p video out. But, if you look at the "formats supported" sub-section, it only supports 720p video.
That's simple to explain. The graphics subsystem easily supports outputting 1080p video. However, the iPad 2 has the same screen resolution as the original iPad, 1024x768. This means that the device itself only supports up to 720p, since 1080p is 1920x1080.
720p is 1280*720; so the device can't display 720p. I think what the spec there is talking about is that it can decode video up to that resolution - it's more of a codec performance number.
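Quick arithmetic for that point: a 1280x720 frame doesn't fit the 1024x768 panel at 1:1, so on the device it has to be scaled down, while the higher resolutions in the spec presumably apply to the video-out path:

    let panel = (w: 1024.0, h: 768.0)   // iPad screen
    let frame = (w: 1280.0, h: 720.0)   // 720p source
    let scale = min(panel.w / frame.w, panel.h / frame.h)  // 0.8
    print("720p fits the panel at \(Int(frame.w * scale))x\(Int(frame.h * scale))")  // 1024x576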
Nice little incremental update. I think they did a pretty good job addressing some of the issues people had with the original model -- at least within the realm of what is technically possible while meeting the same price point.
The rows with specs that don't differ between the two versions span two columns. Look at the backgrounds. It's true for Display, Chip, Cameras, and Sensors.
I don't have an iPad and was waiting for this one hoping for the hi-rez screen. I might pull the trigger on it anyway because I'm not that patient, but it certainly would've been nice.