Google IO 2019 liveblog: everything from the Pixel 3a launch to Android Q and the Nest Hub Max

Our Google IO 2019 liveblog is ongoing, as we report on a developer conference keynote that's already nearing the two-hour mark. We have a rundown of the news from the keynote below – straight from Mountain View, California.

So far, Google has announced Android Q beta 3 with new features including Dark Theme, as well as the Nest Hub Max (and a price drop for the original Google Home Hub, now renamed Nest Hub).

We're on the ground today – we picked up our badges yesterday – and we're delivering minute-by-minute updates about Android Q, the Pixel 3a, and possible updates to Nest and the Google Home devices.

And, of course, we'll also be the first to report on any other surprises. Sure, you can always check out the Google IO livestream video, but for people at work (supposedly working), this is where you need to stay locked for all the latest live updates.

Google IO liveblog: real-time updates

All times in Pacific Daylight Time

11:45am: We're calling it. Our Google IO liveblog is done and all of the announcements are below.  

11:40am: The last minutes of the Google IO keynote slow down in terms of software and hardware announcements, but go deep into real-world needs where people need a helping hand the most. Google is now talking about how it's using its neural networks and physics-based models to forecast floods.

If you've come looking for Android Q announcements, Google Pixel 3a features or the Nest Hub Max, you can scroll down below.

11:34am: The Google IO 2019 keynote has breached the one-and-a-half-hour mark and it hasn't really slowed down. Google is talking about TensorFlow, machine learning, medicine and the Google AI healthcare team. This is usually accompanied by a touching video about improving patients' lives with tech-infused care and diagnoses.

11:27am: The Pixel 3a is coming to more than just Verizon in the US. In addition to Verizon, the entire Pixel series will come to T-Mobile, Sprint and US Cellular. It'll also be sold unlocked from the Google Store, which makes it compatible with Google Fi.

As for Pixel 3a specs, it'll have a Snapdragon 670 chipset (as opposed to the Pixel 3's Snapdragon 845), a Full HD display, 4GB of RAM, 64GB of storage and a polycarbonate body. Those are the big differences from the Pixel 3. The one constant is that amazing Pixel 3 camera.

11:26am: Google is talking about Pixel 3a battery life, and it looks to be all-day, like the Pixel 3: Google suggests 30 hours on a single charge and 7 hours of use from just 15 minutes of fast charging. Of course, you'll likely get more time out of the Pixel 3a XL than the Pixel 3a.

11:25am: The Google Maps AR feature we saw touted at Google IO last year is finally coming to consumers, and it'll be available first on Pixel phones, according to Google. It's something I want when exiting the subway in New York and trying to figure out which way is uptown and which is downtown.

11:24am: The Google Pixel 3a camera's Night Sight mode is being shown against the iPhone XS, and the differences are stark. The Pixel 3a photos look just as good as the Pixel 3 camera's photos, and you get free backups via Google Photos.

11:23am: It'll have a 3.5mm headphone jack, which makes more sense on a budget phone than on a flagship. (But really, I'd like to see it remain on flagships, too.)

11:22am: The Google Pixel 3a and Pixel 3a XL are about to launch. Will this be almost as good as the Google Pixel 3 at almost half the price? They're plastic but look very similar to the Pixel 3, and come in three colors: Just Black, Clearly White and the new Purple-ish.

11:20am: Here are the Nest Hub Max price details straight from the Google IO keynote: it'll cost $229, and the original Home Hub (now named Nest Hub) drops to $129. They'll be available in 12 new markets and support nine new languages.

11:19am: Want to feel like you have 'The Force'? The Hub Max will let you gesture to pause what's playing, which, as per the demo at least, looks much easier than shouting over a loud speaker.

11:17am: Face Match sounds pretty useful, with personalized recommendations, calendar reminders and even greetings in the morning based on who is standing in front of the 10-inch display. Google is really putting the power of the camera to use with this more advanced version of the Hub.

11:14am: Google Home Hub is being renamed Nest Hub, and it's being joined by the Nest Hub Max. It's a new product with a camera and a 10-inch display, designed to be a centerpiece of your home. You can use it like a Nest Cam and check on things in your home via the Nest app. Its wide-angle lens gives you a good field of view. For privacy, there's a green camera recording light and an on-off switch for the camera on the back.

11:13am: On to Google Home with more talk about… you guessed it: Google Assistant. It's the software that Google is pushing out to every device. It's also talking about respecting your privacy in big, bold letters. It means it, everybody!

11:12am: No Android Q release date (although we're fairly certain it'll be in early August as always) and no official name (your guess is as good as ours).

11:10am: Android Q beta 3 is available on 21 devices from 12 OEMs. We're uploading a photo of all of the third-party company logos right now. The list includes OnePlus and Nokia, and it's definitely an improvement on last year's seven beta participants.

11:08am: Google's just announced Focus mode. It's like an advanced Do Not Disturb mode, and it's coming to Android P and Q devices 'this fall', so likely around August or September.

11:07am: Digital Wellbeing and privacy are what Google is talking about right now. You'll have more control over your location (when you order pizza, you can allow the app to know your location, but it won't follow you around indefinitely).

11:02am: Dark Theme is coming to Android Q. It's officially among the more than 50 features of Android Q, and it'll light up fewer pixels (saving battery on OLED screens). It can be accessed from the notification shade (shade has a whole new meaning) via the Dark Theme toggle or Battery Saver mode. You heard it here first on our Google IO 2019 liveblog.

11:01am: Here's another Live Caption demo, and this time the twist is that Google's speech recognition technology can be used offline (the demo was in airplane mode). On-device machine learning protects user privacy, too.

10:59am: We knew about this Android Q feature ahead of our Google IO liveblog, but Continuity is coming to prepare for foldables. When you move from the folded state to an unfolded state, the app you're using adjusts seamlessly. Samsung built this into its Android Pie phones, but it'll come pre-packaged with Android Q.
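For the developers in the audience, here's a rough idea of what 'adjusting seamlessly' can look like on the app side – a minimal Kotlin sketch assuming an activity that opts into resizing and handles the configuration change itself, rather than anything Google showed on stage. The ReaderActivity class and the 600dp breakpoint are hypothetical, for illustration only.

```kotlin
// Minimal sketch: an activity that handles fold/unfold screen-size changes itself.
// In the manifest this activity would declare something like:
//   android:resizeableActivity="true"
//   android:configChanges="screenSize|smallestScreenSize|screenLayout|orientation"
// The class name and the 600dp breakpoint are hypothetical.

import android.app.Activity
import android.content.res.Configuration
import android.os.Bundle

class ReaderActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // setContentView(R.layout.reader) // layout omitted in this sketch
        applyLayoutFor(resources.configuration)
    }

    // Called when the device folds or unfolds and the window size changes,
    // instead of the activity being torn down and recreated.
    override fun onConfigurationChanged(newConfig: Configuration) {
        super.onConfigurationChanged(newConfig)
        applyLayoutFor(newConfig)
    }

    private fun applyLayoutFor(config: Configuration) {
        // Switch between a one-pane and a two-pane layout based on available width.
        val useTwoPane = config.screenWidthDp >= 600
        // ...swap fragments or adjust constraints accordingly...
    }
}
```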

10:57am: Android Q is next, and Google is announcing that there are over 2.5 billion active Android devices from 180 device makers around the world. And now foldable phones are coming to Android OEMs.

10:55am: Live Transcribe, Live Caption, Live Relay and Project Euphonia are among the advanced accessibility features Google is working on in 2019.

10:52am: Live Relay is a similar on-device feature that lets you take a phone call by typing: your written responses are read aloud to the other person while their speech is transcribed for you to read.

10:51am: “It's such a simple feature, but has such a big impact on me,” said a hearing-impaired person in a Google video. The use cases for the Live Caption feature are groundbreaking for deaf users, and Google says it's also handy for those who can hear but are somewhere noisy, like a subway.

10:49am: Live Caption is new – with one click you can turn on captions for a web video, a podcast, or even a video you record at home.

10:45am: Federated Learning is what Google is using to improve a global model from everyone's devices without collecting individual users' data. Here's an example: take Gboard, Google's keyboard. When new words become popular – as people are typing in BTS or YOLO – after thousands of people type that in (or millions in the case of BTS), Gboard will be able to harness this data without tapping into individual users' privacy. It sounds a lot like Apple's 'Differential Privacy' approach.
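If you're curious how that works under the hood, here's a minimal Kotlin sketch of the federated averaging idea – purely illustrative, not Google's actual Gboard code. The ClientUpdate type and federatedAverage function are hypothetical: each device trains locally and reports only a weight delta plus a sample count, and the server blends those deltas, so individual keystrokes never leave the phone.

```kotlin
// Hypothetical sketch of federated averaging – an illustration of the concept,
// not Google's implementation.

// One device's contribution: a locally computed weight delta and how many
// samples (words typed, etc.) it was trained on.
data class ClientUpdate(val weightDelta: FloatArray, val sampleCount: Int)

// Combine client updates into new global model weights.
// Each delta is weighted by the client's share of the total training samples,
// so the server only ever sees aggregated numbers, never raw typing data.
fun federatedAverage(globalWeights: FloatArray, updates: List<ClientUpdate>): FloatArray {
    require(updates.isNotEmpty()) { "need at least one client update" }
    val totalSamples = updates.sumOf { it.sampleCount }.toFloat()
    val aggregatedDelta = FloatArray(globalWeights.size)
    for (update in updates) {
        val share = update.sampleCount / totalSamples
        for (i in aggregatedDelta.indices) {
            aggregatedDelta[i] += share * update.weightDelta[i]
        }
    }
    // Apply the aggregated delta to the shared model.
    return FloatArray(globalWeights.size) { i -> globalWeights[i] + aggregatedDelta[i] }
}
```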

10:43am: Incognito Mode is coming to Google Maps (so it'll be on Chrome, YouTube and Maps in 2019), and there's now one-tap access to your Google account from Chrome, Search, YouTube and Maps.

10:40am: Google is talking about privacy and consumer control. Incognito mode in Chrome is 10 years old, and Google Takeout is a valuable service for exporting data, but Google says it knows that its work isn't done. It's making all privacy settings easy to access from your profile, so you can view and manage your recent activity and change your privacy settings. This goes along with the auto-delete tracking controls (3 months or 18 months) that it announced last week. It's rolling out in the coming weeks.

10:36am: Breaking news: Now you can ask a Google Home speaker to turn off an alarm without having to say “Hey Google” first. Just shout “Stop” and the annoying alarm will shut off. So helpful. It's coming to English-speaking locales starting today, according to Google, so look for it soon.

10:35am: Driving Mode will be available this summer on any Android phone with Google Assistant. That means over one billion devices in over 80 countries. Google's mission is to build the fastest, most personal way to get things done.

10:33am: Google Assistant is coming to Waze in the coming weeks, and there's going to be a driving mode in the Assistant app, bringing personalized suggestions and shortcuts (like directions to dinner, top contacts, or a podcast you want to resume in the car). Phone calls and music appear in a low-profile way so you can get things done without leaving the navigation screen.

10:32am: Google is expanding the Assistant's abilities with “personal references.” It'll understand “Hey Google, what's the weather like at Mom's house this weekend?” You're always in control of what it knows about you, your family and your personal information. Google proposes this will be very helpful on the road with Android Auto.

10:30am: The next-generation Google Assistant is coming to newer Pixel phones later this year.

10:29am: Google showed off a more complex speech-to-text scenario in which the Assistant could send an email – and tell the difference between completing an action (opening an email and sending it) and taking dictation. Right now, Google Assistant can't do that and requires tapping the screen. The future is touchless.

10:27am: Now onto a more practical (and speedy) demo: sending messages, looking up a photo to send to a friend, and replying with that photo. It's all done without the need to touch, and includes multi-tasking.

10:26am: So far, the speed of Assistant seems to have improved a bit. But this is just how we've imagined it should work. It's getting a lot of applause from people in the crowd (who probably don't find Google Assistant to be instant).

10:24am: Time for the next generation of Google Assistant. The bold vision: what if the Assistant was so fast that tapping to operate your phone almost seemed slow? Google wants it to be 10x faster.

10:22am: Importantly, Duplex on the web doesn't require input from businesses. It'll work automatically, according to the CEO of Google. It's the company's way of building a more helpful Assistant.

10:21am: Google is tackling painstakingly slow online reservations. Say “Book a National car rental for my trip” and Google will start filling out the details for you. It's acting on your behalf, although you're always in control of the flow: make, model, color, whether or not you want a car seat. It means a lot less input and selection, with modifications only where you need them.

10:20am: Sundar is back on stage talking about last year's big surprise: Google Duplex. It's Google's AI voice assistant that calls restaurants for reservations. Google is now extending Duplex to tasks on the web.

10:20am: The updates for Google Lens will roll out later this month, so you should see them by the end of May.

10:17am: At Google IO 2019, Google Lens is getting new language translation functions. If you see a sign in a foreign language, it'll translate it and even read the words aloud. The coolest part: it'll highlight the foreign words as it pronounces them, so you can follow along (and maybe learn a bit).

10:16am: Wondering what a dish even looks like? Google Lens will be able to pop up a picture based on the words it sees on a menu. No word on how it gets this picture (whether it's based on actual photos from the restaurant or on what the dish generally looks like).

10:14am: Another Google Lens example: point your phone camera at a menu in a restaurant, and it'll point out the popular dishes and tell you what other people are saying about them on Google Maps (which seems to be where the reviews are sourced). Lens can help you pay for the meal, even going as far as helping to calculate the tip.

10:13am: People have used Lens more than a billion times so far. It's indexing the physical world, much like Search indexes the billions of pages on the web.

10:11am: There's a 3D shark on the Google IO 2019 stage. How? The AR shark, with layers of teeth (and the audience behind it), came from a Google search: click on a 3D model in the results and, using a phone camera, it's placed in the real world, merging 3D objects with your surroundings.

10:09am: The camera is coming to Google Search. Search for muscle flexion and you can see 3D models in the results and place them in your own apartment.

10:08am: Google wants to be “surfacing the right information in the right context.” This may be code for “We're doing our job to defeat fake news and hoaxes.”

10:05am: Sundar says: “Our goal is to build a more helpful Google for everyone. And when we say helpful, we mean giving you the tools to improve your knowledge, success, health and happiness.” So far, it's been a recap of what Google does now. We're still waiting for what it's doing next.

10:03am: Google is using AR in its official Google IO 2019 app to help users better navigate this outdoor developer conference.

10:03am: Sundar just took the stage. “I would like to say welcome to the language that all of our users speak, but we want to keep this keynote under two hours.”

10:00am: Google is starting with a video: a retrospective of various technologies, from the original cell phone to the N64 controller and what looks to be Star Trek with Leonard Nimoy.

9:59am: Last-minute before go time, and my teammates are contemplating live blogging about my live blogging. “He looks serious and stressed. Oh, wait, he just crinkled his forehead.” Yes, I'm in game mode.

9:57am: Predictions on the first thing that will be announced at Google IO 2019: lots of numbers about the success of the company and how developers are gravitating toward coding for Android.

9:50am: We're 10 minutes from the Google IO keynote and here's your team on the ground (left to right): Matt Swider, David Lumb and Nick Pino. We're ready to ace this liveblog with real-time updates of everything announced.

9:29am: I recall two human DJs at the last Google IO I was at (left). With Google's AI DJ (right), I wonder if the human DJs are still getting gigs in this increasingly autonomous economy.

9:27am: Google's AI DJ had to reset, because while the future is autonomous, it's not yet perfect. It's up and spinning again with what the kids would call sick beats.

9:22am: On the Google IO keynote stage, Google has an AI DJ playing music accompanied by a human DJ (mainly to put the record on the turntable). It's automatically adjusting the tempo.

9:00am: We're in our seats for Google IO 2019 and just one hour from the start of the developer conference. We're ready to tackle the keynote.

8:30am: Here's the new Google IO signage for 2019.

May 7, 1:09am: We've wrapped up our Google IO 2019 planning and are updating the liveblog one last time. Expect early morning updates soon.

Yesterday, 3pm PT: I have my Google IO badge. David Lumb and Nick Pino are also joining me to provide liveblog commentary. Only a few hours left before the keynote starts.

10:30am PT: Why is it that the Google Maps estimate for ridesharing apps is always way off (especially for Lyft)? Maybe this is something the company can fix at Google IO (in addition to quicker Android updates and messaging).

10am PT: We're 24 hours from Google IO and have successfully made it from New York City to Mountain View, California (with a flight to nearby San Francisco).

You can really see the difference in weather here in the Bay Area:

Refresh for more Google IO 2019 news as the event begins at 10am PT.
