Explainers

The importance of artificial intelligence in smartphones

Is this still the future of technology?

Have you ever wondered what smartphone brands actually mean when they tell you that their cameras use artificial intelligence (AI)?

With AI now becoming a significant part of our daily lives, we look into how this technology found its way into the market and ask whether or not AI truly is the future.

What is Artificial Intelligence?

Artificial intelligence, or AI for short, is not exactly a new concept in the world of technology. In essence, it means giving machines human-like intelligence through a system of information and programs or applications built into them.

Machines with built-in AI can perform a variety of tasks usually associated with human intelligence, such as problem solving, gathering knowledge, and logical reasoning, among others. It's basically making machines smarter and, in a way, more human-like.

Illustrations by Kimchi Lee

AI has been a part of many devices over the past few years, from smart homes to applications on your smartphone. Companies like Amazon and Google have come up with smart home assistants, such as Alexa and Google Assistant, that help people with their day-to-day tasks.

Businesses with an online presence have also integrated chatbots and online assistance bots into their websites to automatically answer customer concerns based on the information given.

How AI found its way to smartphones

Artificial intelligence has often been associated with creating robots that perform human-like functions at a much faster, more efficient rate, an image heavily portrayed in mainstream media. Through AI, these machines learn more about the environment they're in and carefully adjust to meet the needs of their users. This learning process is called machine learning.

Nowadays, machine learning isn't limited to robots that learn what people are doing; it has branched out to what people are thinking, asking about, and saying to one another. AI has slowly made its way into devices that are much more accessible to us, primarily through the internet.

Machine learning is now incorporated into smart home devices, video streaming services like YouTube and Netflix, and social media platforms such as Facebook and Twitter. In short, the technology behind AI constantly learns more about people, their interests, and their day-to-day activities.

The newest additions to the family of AI-integrated devices are smartphones themselves. Companies like Apple and Google have looked into integrating AI into the processors of their flagship phones, the iPhone and Pixel series, respectively. Early 2018 saw most Android smartphone brands integrate AI within their phones as a way of enhancing the user experience even further; Huawei and ASUS released new flagship lines with cameras that use AI to respond more intelligently to the environment around the user.

Smartphones could very well lead the transition of all devices toward machine learning and AI in the near future.

Smartphones with AI

As mentioned, two companies in particular have integrated AI into their smartphones to enhance the user experience in entirely new ways. One of them is ASUS, whose recently released ZenFone 5 series features cameras powered by AI. Its shooters focus primarily on taking better photos and adjusting to the environment around you. The ZenFone 5's AI Photo Learning allows the phone to learn how you like your photos and adjust the settings accordingly so you don't have to.

Apart from its cameras, the ZenFone 5 series uses AI to boost overall performance. The base model is powered by a Qualcomm Snapdragon 636 processor, which enables the full use of the phone's AI features. AI Boost gives the handset an instant bump in performance when running heavy-duty applications and games. The ZenFone 5's AI also predicts which apps you will use next and learns which apps you use regularly.
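
If you're curious what "learning which apps you use" could look like under the hood, here's a rough Python sketch of a frequency-and-recency scorer. It's purely illustrative and not ASUS' actual implementation; the app names, weights, and one-hour decay are our own assumptions.

```python
from collections import Counter
import time

class AppPredictor:
    """Toy next-app predictor: scores apps by how often and how recently they were opened."""

    def __init__(self):
        self.launch_counts = Counter()
        self.last_launch = {}

    def record_launch(self, app: str) -> None:
        self.launch_counts[app] += 1
        self.last_launch[app] = time.time()

    def predict(self, top_n: int = 3) -> list[str]:
        now = time.time()

        def score(app: str) -> float:
            # Frequency term plus a recency bonus that decays over an hour
            recency = max(0.0, 1.0 - (now - self.last_launch[app]) / 3600)
            return self.launch_counts[app] + 2.0 * recency

        return sorted(self.launch_counts, key=score, reverse=True)[:top_n]

predictor = AppPredictor()
for app in ["camera", "messages", "camera", "maps", "camera", "messages"]:
    predictor.record_launch(app)
print(predictor.predict())  # e.g. ['camera', 'messages', 'maps']
```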

Another company that integrates AI in its smartphones is Huawei, with the Mate 10 and P20 series. They’re powered by the Kirin 970 processor — which boosts overall performance and efficiency using integrated AI. This means that the phones will adjust to how much you use them and maximize performance every step of the way. They also come with Huawei’s EMUI 8.0 with its own set of AI features such as Smart Screen for multitasking and real-time translation during calls.

Much like the ZenFone 5, the Huawei Mate 10 and P20 phones also have cameras powered by AI. This powers the phones’ dual-lens camera setups for scene and object recognition, automatically adjusting the camera’s settings to suit the situation. Huawei also emphasizes producing professional-grade photos by allowing the AI to adjust the camera’s focus on the subject. That way, you are able to achieve a perfect-looking selfie or portrait — without the need to manually adjust the settings for a long period of time.
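
To give a rough idea of how scene recognition can translate into camera settings, here's a small illustrative Python sketch: a stand-in classifier labels the frame, and the phone applies a preset tuned for that scene. The labels, presets, and the classify_scene stub are assumptions for illustration, not Huawei's or ASUS' actual pipeline.

```python
# Illustrative only: a toy mapping from a detected scene to camera presets.
# The scene labels and numeric values are invented for this sketch.
SCENE_PRESETS = {
    "food":     {"saturation": 15, "sharpness": 10, "exposure": 0.0},
    "portrait": {"saturation": 5,  "sharpness": 0,  "exposure": 0.3, "bokeh": True},
    "night":    {"saturation": 0,  "sharpness": -5, "exposure": 1.0, "iso_cap": 3200},
    "default":  {"saturation": 0,  "sharpness": 0,  "exposure": 0.0},
}

def classify_scene(frame) -> str:
    """Stand-in for an on-device neural network that labels the current frame."""
    return "portrait"  # pretend the model recognized a face close to the lens

def apply_ai_settings(frame) -> dict:
    scene = classify_scene(frame)
    preset = SCENE_PRESETS.get(scene, SCENE_PRESETS["default"])
    return {"scene": scene, **preset}

print(apply_ai_settings(frame=None))
```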

What we get from AI

Artificial intelligence opens up many opportunities for technology to process thoughts and insights the way humans do. AI allows machines to learn more about us and tailor their processes and capabilities to match, from search engines to smarter applications. When handled properly, AI can deliver better and more efficient ways of dealing with the problems people face almost every single day.

The downside is that AI also has the potential to invade one's privacy, especially through one's smartphone. Because the technology is constantly learning more about its user through their devices, it opens the door for that data to be retrieved by, quite literally, anyone on the internet.

And because people nowadays check their smartphones almost every chance they get, those who truly understand how AI works can abuse that knowledge for personal gain, whether through malicious activities like cyberstalking and cyberbullying or online attacks like hacking and phishing.

The future of AI

2018 is shaping up to be the year of AI, with the unveiling of smartphones and revamped smart devices built to upgrade the user experience. The possibilities for artificial intelligence are endless, given how widely it is already used across every available platform.

For now, it's intelligent cameras on your smartphone that adjust settings for you and save you the hassle of getting the perfect image. Sometime in the future, AI could very well exist even in a gaming controller or mirrorless camera that adjusts to your needs. However, we have to be aware of the dangers of pushing AI to its fullest, as carelessness on our part can make those risks real.

Indeed, the future is bright for artificial intelligence — as long as we use it for the right reasons.

Explainers

The secrets behind iPhone 13’s Cinematic Mode

Together with Apple’s VP for iPhone Product Marketing as well as their Human Interface Designer

For the first time ever, we had a three-way interview with Apple's VP for iPhone Product Marketing, Kaiann Drance, and one of its leading Human Interface Designers, Johnnie Manzari. If you're not starstruck enough, both of them appeared in Apple's September 2021 keynote event!

Beyond new camera sensors, the iPhone 13 series also comes with new camera features. One of them is the new Cinematic Mode.

If you've watched some of our latest iPhone videos, including the Sierra Blue iPhone 13 Pro Max unboxing, you've already had a sneak peek at that new video mode.

We’re not gonna lie, it’s one amazing camera feature Apple has managed to deliver.

But what are the secrets behind it? And how do the technicalities actually work?

Watch our 16-minute interview with the Apple executives explaining why Cinematic Mode is the next big thing in mobile videography.

Apps

How Google alerted the Philippines during the July earthquake

Crowd-sourcing data

Illustrations by Kris Blanco

Back in July, an earthquake rocked Metro Manila. Unbeknownst to most but noticed by some, a globally renowned company was helping everyone through the natural incident: Google. In the few minutes leading up to and during the 6.7 magnitude earthquake, Android users received important alerts warning them of the ongoing tremors. Though it wasn’t the dreaded Big One, the alert afforded attentive users a few precious seconds to either seek appropriate cover or stop doing dangerous tasks.

Incidentally, the tech behind Google's earthquake alert system wasn't just hastily built on existing databases or social media feeds. Google actually packed a fully responsive earthquake sensor into Android phones.

Faster than an earthquake

The ever-increasing speed of technology has been a constant talking point since the rise of smartphones. Developers and users alike have wondered how accurately or quickly our favorite devices can warn us of things happening around us. There's even an XKCD comic about how a tweet about an earthquake can reach readers before the shaking itself does.

Over the years, technology has developed new ways to deliver alerts. From simple weather apps to city-wide messaging systems, users can receive warnings in a timely fashion. Practically nothing is a surprise anymore with the right technology.

Building on that, Google has developed a new system that relies on other Android smartphones to accurately tell whether or not an earthquake is happening.

A quake detector in your pocket

Speaking to Android Police, the feature’s lead engineer Marc Stogaitis described how Google’s earthquake sensor leveraged other devices to tell users about the quake. It all revolves around the different sensors built inside your phone.

As it is, every smartphone comes with a host of sensors to support its different functions. An ambient light sensor can seamlessly adjust brightness and camera settings, and an accelerometer and gyroscope can track motion and orientation, for example. With earthquakes, the key signal is the accelerometer picking up a smartphone's movement and vibrations as the ground shakes.

According to Stogaitis, figuring out the metrics for detecting an earthquake wasn't a problem. After decades of accurate seismograph technology, developers already have a good idea of what they need to measure.
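
To get a feel for what those metrics might look like, here's a minimal Python sketch that flags possible shaking by checking how far accelerometer readings stray from ordinary gravity. The threshold and sample count are illustrative assumptions, not Google's actual detector.

```python
import math

GRAVITY = 9.81           # m/s^2
SHAKE_THRESHOLD = 0.8    # deviation from gravity (m/s^2); illustrative value
MIN_SHAKY_SAMPLES = 5    # require sustained motion, not a single bump

def looks_like_shaking(samples: list[tuple[float, float, float]]) -> bool:
    """samples: (x, y, z) accelerometer readings in m/s^2."""
    shaky = 0
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if abs(magnitude - GRAVITY) > SHAKE_THRESHOLD:
            shaky += 1
    return shaky >= MIN_SHAKY_SAMPLES

# A phone at rest reads roughly (0, 0, 9.81); sustained deviations hint at shaking.
still = [(0.0, 0.0, 9.8)] * 10
shaking = [(0.5, 0.3, 11.2), (-0.6, 0.2, 8.1)] * 5
print(looks_like_shaking(still), looks_like_shaking(shaking))  # False True
```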

However, the technology doesn't stop there. Naturally, there are hiccups to relying on just a single phone's data, or even every phone's data. For one, a city-wide emergency broadcast can make every phone in an area vibrate at once, potentially causing false positives. Plus, relying on a single phone is definitely tricky; plenty of everyday actions cause vibrations akin to an earthquake.

Crowdsourcing a quake

The feature doesn't rely on just one phone, and it doesn't tap into every Android phone in an area either. Instead, it collates data from phones plugged into a charger. A plugged-in phone is the most reliable source in terms of battery: it won't die in the middle of an earthquake and cut off a source of data. Additionally, charging phones are usually stationary, so they won't be affected by motions that merely mimic earthquakes.

Google “listens” to charging devices in an area. If the subset meets the criteria for an earthquake, the company quickly determines the earthquake’s epicenter (based on approximate location) and magnitude. Once the system declares that a quake is indeed happening, it sends out an alert to nearby devices and gives them the time needed to seek shelter.
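
Here's a loose Python sketch of that crowdsourcing idea: only plugged-in phones get counted, and an alert goes out once enough of them in the same area agree. The report format, area buckets, and the five-phone quorum are assumptions made for illustration, not Google's real pipeline.

```python
from collections import defaultdict
from dataclasses import dataclass

QUORUM = 5  # illustrative: how many charging phones must agree before we trust it

@dataclass
class Report:
    area: str          # coarse location bucket, e.g. a city grid cell
    is_charging: bool
    detected_shaking: bool

def detect_quakes(reports: list[Report]) -> list[str]:
    """Return the areas where enough plugged-in phones reported shaking."""
    votes = defaultdict(int)
    for r in reports:
        # Only charging phones count: they're stationary and won't die mid-quake.
        if r.is_charging and r.detected_shaking:
            votes[r.area] += 1
    return [area for area, count in votes.items() if count >= QUORUM]

reports = [Report("metro-manila-03", True, True) for _ in range(7)]
reports += [Report("metro-manila-03", False, True), Report("cebu-01", True, False)]
for area in detect_quakes(reports):
    print(f"Send early-warning alert to devices near {area}")
```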

The alerts naturally prioritize people nearer to the epicenter. But, of course, the speed will ultimately depend on the phone’s connectivity. A phone hooked up to a building’s fast Wi-Fi connection will receive alerts faster than a commuter’s phone on data while going through a tunnel.

Still, the short window the alerts give is enough for users to get themselves out of a precarious situation. Though the feature can potentially warn users of quakes minutes in advance, Stogaitis says it will more realistically push alerts five to ten seconds before the shaking arrives. Even five seconds is enough to get under a table and have some sort of protection against falling debris.
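
A quick back-of-the-envelope calculation shows why a few seconds of warning is realistic: damaging seismic waves travel at only a few kilometers per second, while an alert crosses the network almost instantly. Assuming a rough wave speed of 3.5 km/s and a couple of seconds to detect and confirm the quake (both are textbook-style assumptions, not figures from Google):

```python
WAVE_SPEED_KM_S = 3.5   # rough S-wave speed; actual speeds vary with geology

def warning_seconds(distance_km: float, detection_delay_s: float = 2.0) -> float:
    """Seconds of warning for someone distance_km from the epicenter,
    assuming the network delivers the alert near-instantly."""
    return distance_km / WAVE_SPEED_KM_S - detection_delay_s

for d in (10, 25, 50):
    print(f"{d} km from the epicenter: ~{warning_seconds(d):.0f} s of warning")
# 10 km: ~1 s, 25 km: ~5 s, 50 km: ~12 s
```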

Still keeping things private

For anyone worrying about how Google handles their data, Stogaitis says that the company removes all identifiers from the data except for approximate location. Even with that limited data, Google maintains that the feature will be as accurate as it can be. Either way, it will be useful for any earthquakes in the future.

The earthquake sensor is available for any Android phone running Lollipop and above. Naturally, the feature still requires users to turn on emergency alerts on their phones.

Explainers

The industry’s next big thing: Cloud gaming explained

It’s gaming on the go, but for internet that’s not slow

Everybody's getting into gaming these days, and you can't blame them. With the pandemic still ravaging the world, people turn to their consoles or PCs for some action. However, not everyone can afford expensive PCs or next-gen consoles when they come out.

Instead, a new player comes into the fray with a pretty great idea. What if you could just play your favorite games from any device? And what if we told you they won't take up space on your device at all? That's basically what cloud gaming offers: a way to play games from any device at any time!

So, how does that actually work? What do you need to ensure quality gameplay, and should you even consider it?

The basics of playing on a cloud

On paper, it's pretty easy to understand how cloud gaming works. Basically, you get access to a library of games hosted on a company's remote servers. When you subscribe to the service, you can play that library from virtually any device, regardless of specs. You also don't have to worry about storage problems, since the games are stored on a server.

It's no joke when these companies tell you that you can play your games on any device. Their dedicated data servers do the heavy lifting, making sure the games run smoothly once you access them from the cloud. On your end, you will need a strong and consistent internet connection to keep the experience fluid.
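
Conceptually, the client side is just a thin loop: your inputs go up to the server, and compressed video frames of the remotely running game come back down. Here's a self-contained Python sketch of that idea; the function names and the zlib compression stand in for the proprietary protocols and video codecs real services use.

```python
import zlib

def read_local_input() -> bytes:
    """Placeholder: serialize the current controller/keyboard state."""
    return b'{"buttons": ["A"], "stick": [0.2, -0.7]}'

def server_step(inputs: bytes) -> bytes:
    """Placeholder for the remote side: run one game tick and return an encoded frame."""
    rendered_frame = b"\x00" * (1280 * 720 * 3)  # pretend this is a rendered 720p frame
    return zlib.compress(rendered_frame)         # real services use hardware video codecs

def display(frame: bytes) -> None:
    print(f"displayed frame of {len(frame)} bytes")

# The thin-client loop: inputs go up, compressed frames come down,
# and the game itself never runs on the local device.
for _ in range(3):
    inputs = read_local_input()
    encoded = server_step(inputs)    # in reality this round trip crosses the network
    display(zlib.decompress(encoded))
```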

Several companies already offer cloud gaming services people can subscribe to. Some examples include NVIDIA's GeForce Now, Microsoft's xCloud, and Google Stadia, all of which run PC games on remote servers. These companies even update their server hardware every so often to deliver the best possible quality.

System requirements for cloud gaming

Much like an ordinary PC or gaming console, a cloud gaming service needs certain equipment to run smoothly. First, companies must set up active data centers and server farms that run the games. These data centers ensure that games are up and running while reducing latency. In other words, they serve as the powerhouse of cloud gaming.

Next on the list is the network infrastructure needed to stream those games to users. To ensure that people don't experience lag when they play, companies also invest in proper data connections. In most cases, though, this isn't something these companies fully control; it largely depends on the internet service providers available in each market.

On the front end, companies also provide dedicated hardware and software to access the cloud. For example, NVIDIA integrated GeForce Now into its own cloud streaming device, the NVIDIA Shield, back in 2013. Meanwhile, Google Stadia relies heavily on pre-existing Google software like Google Chrome and the Stadia app.

Something great to offer, for the most part

Cloud gaming services offer something unique in the industry. Essentially, they spare users from investing in expensive PCs by letting them play from virtually any device. Whether it's a smartphone, a laptop, or even a smart TV, people get access to games at high frame rates without owning an RTX 3080.

Furthermore, the game and save files are stored on the cloud and don't take up any storage on your devices. This is greatly beneficial for people who are already running low on storage space, especially if they play Call of Duty: Warzone. With everything stored on the cloud, you won't need to dedicate a huge chunk of a 512GB SSD to a single game.

However, one of the biggest issues with cloud gaming revolves around the very thing it's built on: the internet. Specifically, it's the user's connection, since these services demand fast, low-latency internet to run smoothly on any device. Basically, you will need either a wired Ethernet connection or a 5G wireless connection to keep latency as low as possible.
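
To put "lowest latency possible" into numbers, here's a small Python sketch of a round-trip budget. The encode and decode times are illustrative assumptions; the point is that everything stacks on top of the roughly 16.7 ms a 60fps game already needs per frame before your button press shows up on screen.

```python
def total_input_latency_ms(network_rtt_ms: float,
                           encode_ms: float = 5.0,
                           decode_ms: float = 5.0,
                           frame_time_ms: float = 1000 / 60) -> float:
    """Rough end-to-end latency from pressing a button to seeing the result."""
    return network_rtt_ms + encode_ms + decode_ms + frame_time_ms

for rtt in (10, 40, 120):   # wired fiber, decent Wi-Fi, congested mobile data
    print(f"RTT {rtt:>3} ms -> ~{total_input_latency_ms(rtt):.0f} ms input lag")
# RTT  10 ms -> ~37 ms, RTT  40 ms -> ~67 ms, RTT 120 ms -> ~147 ms
```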

That infrastructure isn't readily available in most markets, a prominent issue in many developing countries. And even where providers have 5G in the pipeline, those same providers often put data caps on it. So even if users can play at an optimal frame rate, they're doing so with a restriction in place.

Does this new player have any place?

With the world continuously opening its arms to the gaming industry, innovation has become the key to success. Companies come up with a variety of gaming technologies that cater to a wide range of people, and until now, gaming has largely revolved around hardware, whether individual components or pre-built systems.

Cloud gaming doesn't just give people another option in the mix. Rather, it challenges the notions of availability and accessibility and offers a viable solution: it takes the physical hardware limitations off the user's end and makes gaming available to everyone.

But like most gaming technologies, it still has its limits. These systems experience bottlenecks on both the provider's and the user's end. In the end, it all depends on how much you're willing to shell out and how willing you are to accept the trade-offs.

Illustrations by Raniedel Fajardo
