Explainers

Where did the 18:9 ratio come from?

And should you get an 18:9 phone?


2018 will be the year of the notched, bezel-less display. Along with it comes a new number that not everyone knows what to make of yet — the 18:9 aspect ratio.

Smartphones are quickly adopting the new 18:9 standard. However, since the ratio is still in its relative infancy, you probably know more about its predecessor, 16:9. Ever since the arrival of widescreen TVs, everyone has accepted 16:9 as the industry standard. With 18:9 on the horizon, should we care that our phones are getting taller?

What is aspect ratio?

First, let’s define what an aspect ratio is. The two numbers describe the shape of a device’s screen or a piece of media: how wide it is relative to how tall it is. By convention, the first number is the width and the second is the height; phone makers simply quote the longer side of a portrait screen first, which is how a tall phone ends up labeled 18:9.

For example, the square photos on Instagram have an aspect ratio of 1:1. As a screen gets wider or taller, the corresponding number in the ratio grows. Ratios vary from the old 4:3 to the ubiquitous 16:9 to the new 18:9.
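In code, reducing a pixel resolution to its simplest ratio takes only a greatest common divisor. A quick illustrative sketch:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a pixel resolution to its simplest aspect ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1920, 1080))  # 16:9
print(aspect_ratio(2160, 1080))  # 2:1, marketed on phones as 18:9
print(aspect_ratio(1080, 1080))  # 1:1, an Instagram square
```

Note that a "18:9" phone reduces to 2:1; manufacturers keep the 18:9 label so the comparison against 16:9 is obvious.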

Where did it come from?

The history of aspect ratios has always been intimately linked to the movie industry. Experimenting with aspect ratios began with cinematographers trying to perfect their films’ vision. With every new experiment, a new aspect ratio was born. Sometimes, an experiment became popular enough to become an industry standard.

The world’s first documented aspect ratio, 4:3, is also one of the world’s most popular. Remember those bulky CRT monitors you used to have? Those used the 4:3 ratio, which was adapted from old-school cinema. Early filmmakers sized their frames using the film strip’s perforations (the holes along its edges); the standard 35mm frame was four perforations tall, yielding a roughly 4:3 image.

When television was invented, the world of cinema faced tremendous competition. TVs, too, started off with a 4:3 display. Naturally, the at-home convenience of a TV gave it an advantage over the inconvenience of driving to a movie theater. The film industry had to compete.

Going head-to-head with TV, cinematographers invented ever wider formats. From this era came widescreen processes like 70mm film, which could fit far more of a scene on screen. They became so popular that 70mm is still in use today.

Our old friend, 16:9, arrived shortly after this boom. With aspect ratios popping up everywhere, there came a need for a standard everyone could follow. Dr. Kerns Powers, an engineer with the SMPTE, proposed the 16:9 format as a compromise between the industry’s most used ratios. With this format, you can watch either a TV show or a movie with minimal letterboxing (the black bars on the edges of your screen).
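A commonly told detail of this story is that the compromise is roughly the geometric mean of the two extremes it had to bridge: 4:3 television and 2.35:1 widescreen cinema. A quick check (the derivation here is that commonly told account, not a claim from this article):

```python
from math import sqrt

# The extremes to balance: 4:3 TV (~1.33) and 2.35:1 anamorphic cinema.
tv = 4 / 3
scope = 2.35

# Geometric mean of the two extremes.
compromise = sqrt(tv * scope)

print(round(compromise, 2))  # 1.77
print(round(16 / 9, 2))      # 1.78 -- nearly identical
```

Either extreme fits into a 16:9 frame with roughly equal letterboxing or pillarboxing, which is what made it such a workable middle ground.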

Its flexibility skyrocketed the ratio into ubiquity, and most screens adopted it as a result. By 2017, almost every device used a 16:9 display. Even now, the ratio we’re most familiar with is 16:9.

Now, if 16:9 is so effective, where did 18:9 come from?

The birth of 18:9

Because of 16:9’s massive popularity, 4:3 displays became obsolete; today, finding a 4:3 device involves a trip to the nostalgia store. 16:9 began as a compromise, but once everyone adopted it, the compromise became the norm. The feud was now between HDTVs and cinemas.

In other words, the gap between 16:9 screens and cinema’s widescreen formats still existed. The next logical step was another compromise, this time between 16:9 and the wider cinema standards.

In 1998, cinematographer Vittorio Storaro answered this by proposing the Univisium film format, or what we now know as 18:9. He saw the 2:1 ratio as a standard that could make both cinemas and TVs happy.

At the time, only a handful of films used the new standard. In fact, most of them, like Exorcist: The Beginning, were Storaro’s own. Univisium would lie low for over a decade.

The rise of 18:9

In 2013, Univisium entered a renaissance with hit streaming show House of Cards, which was shot in the format. Having found a new home, Univisium crawled its way into other shows like Stranger Things and Star Trek: Discovery. In some circles, 18:9 was already known as the “streaming ratio.”

With the effectiveness of 18:9 proven, devices started adopting the new ratio. In 2017, the LG G6 and the Samsung Galaxy S8 launched with 18:9 in tow. (The S8 would use a slightly adjusted 18.5:9.)

After the trendsetters, more phones jumped on the trend. Google, OnePlus, and Huawei would soon adopt the new ratio. Even Apple’s iPhone X uses a taller ratio: 19.5:9.

Should you get an 18:9 phone?

As it’s still in its infancy, the usefulness of 18:9 isn’t fully apparent yet. However, the ratio already carries a host of benefits for early adopters.

Firstly, an 18:9 phone is future-proofed for a standard that’s quickly gaining traction. More shows are using 18:9, and even Hollywood is testing the waters: the 2015 film Jurassic World used the ratio.

Secondly, Univisium suits existing smartphone features like split-screen view. With the extra vertical real estate, two apps can comfortably share the screen for a true multitasking experience. Even without that feature, a taller phone displays more content on one screen without scrolling.

It’s likely that you won’t see the full benefits until further down the line. Too few shows use 18:9 to call it a true standard, so when you watch most contemporary video on an 18:9 screen, you’ll still notice letterboxing. Some apps may even have trouble stretching to 18:9.
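That letterboxing is easy to quantify. A quick sketch (resolutions chosen purely for illustration) shows what happens when 16:9 video is fitted onto a 2160x1080, i.e. 18:9, screen held landscape:

```python
def bar_size(screen_w: float, screen_h: float, content_ratio: float):
    """Pixels of black bar on each side when content is scaled to fit
    inside a screen while preserving its aspect ratio."""
    fitted_w = min(screen_w, screen_h * content_ratio)
    fitted_h = min(screen_h, screen_w / content_ratio)
    side_bar = (screen_w - fitted_w) / 2
    top_bar = (screen_h - fitted_h) / 2
    return side_bar, top_bar

# 16:9 video on an 18:9 phone (2160x1080) held landscape:
side, top = bar_size(2160, 1080, 16 / 9)
print(side, top)  # 120.0 0.0 -- a 120 px pillar on each side, no top bars
```

So the 16:9 video fills the full height and leaves modest pillars on the sides; the same function shows that wider cinema content would instead leave thin top and bottom bars.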

The vision of 18:9 is still in the future. It will have its growing pains. Critics will even put it down as a fad. However, the new ratio shows a lot of promise in uniting content under one pleasing ratio.

Illustrations by MJ Jucutan

Apps

How Google alerted the Philippines during the July earthquake

Crowd-sourcing data


Illustrations by Kris Blanco

Back in July, an earthquake rocked Metro Manila. Unbeknownst to most but noticed by some, a globally renowned company was helping everyone through the incident: Google. In the moments leading up to and during the magnitude 6.7 earthquake, Android users received alerts warning them of the oncoming tremors. Though it wasn’t the dreaded Big One, the alert afforded attentive users a few precious seconds to seek appropriate cover or stop doing dangerous tasks.

Notably, the tech behind Google’s earthquake alert system wasn’t hastily built on existing databases or social media. Google actually packed a fully fledged earthquake sensor into Android phones.

Faster than an earthquake

The speed of information has been a talking point ever since the rise of smartphones. Developers and users alike have wondered how accurately and how quickly our favorite devices can warn us of things happening around us. There’s even an XKCD comic about how a tweet warning of an earthquake can reach readers before the shaking itself does.

Over the years, technology has developed new ways to deliver alerts. From simple weather apps to city-wide messaging systems, users can receive warnings in a timely fashion. Practically nothing is a surprise anymore with the right technology.

Building on that idea, Google has developed a new system that relies on other Android smartphones to accurately tell whether or not an earthquake is happening.

A quake detector in your pocket

Speaking to Android Police, the feature’s lead engineer Marc Stogaitis described how Google’s earthquake sensor leveraged other devices to tell users about the quake. It all revolves around the different sensors built inside your phone.

As it is, every smartphone comes with a host of sensors supporting its different functions: a light sensor seamlessly adjusts brightness and camera settings, and a gyroscope supports the compass, for example. For earthquakes, the key signal is a phone’s movement and vibration during a tremor, which its accelerometer picks up.

According to the lead engineer, figuring out the metrics for detecting an earthquake wasn’t a problem. After decades of accurate seismograph technology, developers already have an idea of what they need to measure.

However, the technology doesn’t stop there. Naturally, there are hiccups to relying on just a single (or even every) phone’s data. For one, a city-wide alert can buzz every phone in an area at once, potentially causing false positives. And relying on a single phone is definitely tricky: plenty of everyday actions cause vibrations akin to an earthquake.

Crowdsourcing a quake

The feature doesn’t rely on just one phone, but it doesn’t tap every Android phone in an area either. Instead, it collates data from phones plugged into a charger. A plugged-in phone is the most reliable source: it won’t die in the middle of an earthquake and cut off the data. Charging phones are also usually stationary, so they aren’t fooled by motions that mimic earthquakes.

Google “listens” to charging devices in an area. If the subset meets the criteria for an earthquake, the company quickly determines the earthquake’s epicenter (based on approximate location) and magnitude. Once the system declares that a quake is indeed happening, it sends out an alert to nearby devices and gives them the time needed to seek shelter.
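Based on that public description, the aggregation step can be sketched as a toy detector. Everything below (names, thresholds, the centroid epicenter estimate) is a hypothetical illustration, not Google’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class Report:
    coarse_location: tuple  # (lat, lon), rounded for privacy
    shaking_score: float    # how quake-like the accelerometer trace looks
    plugged_in: bool        # only charging phones are trusted

def detect_quake(reports, min_devices=20, score_threshold=0.8):
    """Declare a quake only when enough charging (stationary) phones in
    an area report strong, quake-like shaking at the same time."""
    candidates = [r for r in reports
                  if r.plugged_in and r.shaking_score >= score_threshold]
    if len(candidates) < min_devices:
        return None  # too few agreeing devices; treat as noise
    # Crude epicenter estimate: centroid of the reporting devices.
    lat = sum(r.coarse_location[0] for r in candidates) / len(candidates)
    lon = sum(r.coarse_location[1] for r in candidates) / len(candidates)
    return (lat, lon)
```

Requiring many agreeing devices is what filters out a single dropped phone or a washing machine, while restricting to plugged-in phones filters out motion from pockets and cars.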

The alerts naturally prioritize people nearer to the epicenter. But, of course, the speed will ultimately depend on the phone’s connectivity. A phone hooked up to a building’s fast Wi-Fi connection will receive alerts faster than a commuter’s phone on data while going through a tunnel.

Still, the short window the alerts give users is enough to get out of a precarious situation. Though the feature can potentially warn users of quakes minutes in advance, Stogaitis says it will more realistically push alerts five to ten seconds before the shaking arrives. Even five seconds is enough to get under a table and gain some protection against falling debris.
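The alerts can outrun the quake because network data travels at nearly the speed of light, while damaging seismic S-waves move at only a few kilometers per second. A back-of-the-envelope calculation (wave speed and pipeline delay are ballpark assumptions, not measured figures) shows where a figure like five seconds could come from:

```python
S_WAVE_SPEED_KMS = 3.5   # typical crustal shear-wave speed, approximate
PIPELINE_DELAY_S = 5.0   # assumed detection + delivery delay

def warning_seconds(distance_km: float) -> float:
    """Seconds of warning before shaking arrives, for a user this far
    from the epicenter (negative means the alert arrives too late)."""
    travel_time = distance_km / S_WAVE_SPEED_KMS
    return travel_time - PIPELINE_DELAY_S

print(round(warning_seconds(35), 1))   # 5.0 -- about five seconds of warning
print(round(warning_seconds(100), 1))  # 23.6 -- much more, farther out
```

Users very close to the epicenter get little or no warning, which is why the system prioritizes pushing alerts outward as fast as possible.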

Still keeping things private

For anyone worried about how Google handles their data, Stogaitis says the company strips all identifiers from the data except approximate location. Even with that limitation, Google maintains the feature will be as accurate as it can be. Either way, it should prove useful in future earthquakes.

The earthquake sensor is available for any Android phone running Lollipop and above. Naturally, the feature still necessitates that users turn on emergency alerts on their phone.


Explainers

The industry’s next big thing: Cloud gaming explained

It’s gaming on the go, but for internet that’s not slow


Everybody’s getting into gaming these days, and you can’t blame them. With the pandemic still raging worldwide, people turn to their consoles or PCs for some action. However, not everyone can afford expensive PCs or next-gen consoles when they come out.

Instead, a new player enters the fray with a pretty great idea. What if you could play your favorite games from any device? And what if we told you they wouldn’t take up space on your device at all? That’s basically what cloud gaming offers: a way to play games from any device, at any time.

So, how does that actually work? What do you need to ensure quality gameplay, and should you even consider it?

The basics of playing on a cloud

On paper, cloud gaming is easy to understand. You get access to a library of games hosted on remote servers. When you subscribe, the service runs each game on its own hardware, streams the video to your device, and sends your inputs back, so you can play from virtually any device regardless of its specs. You also don’t have to worry about storage problems, since the games live on the server.

It’s no joke when these companies say you can play your games on any device. Their dedicated data centers do the heavy lifting, so games run smoothly once you access them from the cloud. On your end, all you need is a strong, consistent internet connection.

Several companies already offer cloud gaming services people can subscribe to. Examples include NVIDIA’s GeForce Now, Microsoft’s xCloud, and Google Stadia, all of which run PC games on their servers. These companies even update their server hardware every so often to deliver the best possible quality.

System requirements for cloud gaming

Much like an ordinary PC or gaming console, cloud gaming providers need serious equipment to run smoothly. First, they must set up data centers and server farms that actually run the games, keeping them available while reducing latency. In other words, these serve as the powerhouse of cloud gaming.

Next on the list is the network infrastructure needed to stream these games to users. To keep players from experiencing lag, companies also invest in proper data connections. In most cases, though, this isn’t something they fully control; it mostly depends on the available internet service providers.

On the front end, companies also provide dedicated hardware and software to access the cloud. For example, NVIDIA integrated GeForce Now into its own streaming device, the NVIDIA Shield, back in 2013. Meanwhile, Google Stadia relies heavily on existing Google software like Google Chrome and the Stadia app.

Something great to offer, for the most part

Cloud gaming services offer something unique in the industry. They spare users from investing in expensive PCs by letting them play from virtually any device. Whether on a smartphone, laptop, or even a smart TV, people get games at high frame rates without owning an RTX 3080.

Furthermore, game and save files are stored in the cloud and don’t take up any storage on your devices. That’s a big win for people already running on limited storage, especially if they play Call of Duty: Warzone. With everything in the cloud, you won’t need most of that 512GB of SSD storage.

However, one of the biggest issues with cloud gaming revolves around the very thing it’s built on: the internet. These services demand a fast, low-latency connection to run smoothly on any device, so you’ll realistically need either an Ethernet or a 5G wireless connection to keep latency at a minimum.
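To see why latency is the sticking point, consider a rough input-to-display budget. All figures below are illustrative assumptions, not measurements from any particular service:

```python
# Assumed per-stage delays in milliseconds for one button press to show
# up on screen; real numbers vary widely by provider and network.
budget_ms = {
    "input upload": 10,
    "server render": 16,
    "video encode": 5,
    "video download": 15,
    "decode + display": 8,
}

total_ms = sum(budget_ms.values())
frame_ms = 1000 / 60  # one frame at 60 fps is about 16.7 ms

print(total_ms)                       # 54 ms from input to photon
print(round(total_ms / frame_ms, 1))  # 3.2 -- roughly three frames behind
```

A local PC pays only the render and display costs, which is why even a good connection leaves cloud gaming a few frames behind, and why a congested or capped one makes it unplayable.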

That infrastructure isn’t readily available in most markets, a prominent issue in many developing countries. And even where providers have 5G in the pipeline, they often put data caps on it. So even a user playing at an optimal frame rate is doing so under a restriction.

Does this new player have any place?

With the world continuously opening its arms to the gaming industry, innovation sits at the forefront of success. Companies come up with a variety of gaming technologies that cater to a wide range of people. From individual components to pre-built systems, gaming has long revolved around hardware.

Cloud gaming doesn’t just give people another option in the mix. It challenges the notion of availability and accessibility and offers a viable answer: by taking hardware limitations off the user’s end, it makes gaming available to everyone.

But like most gaming technologies, it still has its limits. These systems experience bottlenecks on both the provider’s and the user’s end. In the end, it will come down to how much you’re willing to shell out and how willing you are to accept the trade-offs.

Illustrations by Raniedel Fajardo


Explainers

Your MagSafe Questions Answered

Do you really need it?


If you’ve ever owned an old MacBook, you’ll know its charger magnetically snaps into place. That particular technology is called MagSafe.

After the MacBook Pro’s Touch Bar and USB-C overhaul in 2016, everyone thought MagSafe was gone for good. That changed when Apple announced a new MagSafe for the iPhone 12 series four years later.

MagSafe itself might not be new, but its implementation in the latest iPhones makes the technology even more useful. Beyond holding the phone securely in place for wireless charging, a plethora of manufacturers are continuously working on cases and accessories that plug into the MagSafe ecosystem.

But is the Apple MagSafe more than just a gimmick? And do you really need it?

Watch our in-depth Apple MagSafe explainer here.

