Over the past decade, devices using the Universal Serial Bus (USB) standard have become part of our daily lives. From transferring data to charging our devices, the standard has continued to evolve, with USB Type-C as its latest connector. Here’s why you should care about it.
First, here’s a little history
Chances are you’ve encountered devices with a USB port, such as a smartphone or computer. But what exactly is the USB standard? Simply put, it’s a communication protocol that allows devices to talk to each other over a standardized port or connector. It’s basically what language is to humans.
When USB was first introduced to the market, the connectors used were known as USB Type-A. You’re likely familiar with this connector; it’s rectangular and can only be plugged in one orientation. To make a connection, a USB Type-A connector plugs into a USB Type-A port, just as an appliance plugs into a wall outlet. This port usually resides on host devices such as computers and media players, while Type-A connectors are usually found on peripherals such as keyboards and flash drives.
There are also USB Type-B connectors, and these usually go on the other end of a USB cable that plugs into devices like a smartphone. Due to the different sizes of external devices, there are a few different designs for Type-B connectors. Printers and scanners use the Standard-B port, older digital cameras and phones use the Mini-B port, and recent smartphones and tablets use the Micro-B port.
Specifications improved through the years
Aside from the type of connectors and ports, another integral part of the USB standard lies in its specifications. As with all specifications, these document the capabilities of the different USB versions.
The first-ever version of USB, USB 1.0, specified a transfer rate of up to 1.5Mbps (megabits per second), but this version never made it into consumer products. Instead, the first revision, USB 1.1, was released in 1998. It’s also the first version to be widely adopted and is capable of a max transfer rate of up to 12Mbps.
The next version, USB 2.0, was released in 2000. This version had a significantly higher transfer rate of up to 480Mbps. Both versions can also be used as power sources at 5V, supplying 100mA by default and up to 500mA once negotiated with the host.
Next up was USB 3.0, which was introduced in 2008 and defines a transfer rate of up to 5Gbps (gigabits per second), a tenfold increase from the previous version. This feat was achieved by more than doubling the pin count. To make them easier to spot, these new connectors and ports are usually colored blue, compared to the usual black or gray for USB 2.0 and below. USB 3.0 also improves upon its power delivery with a rating of 5V, 900mA.
In 2013, USB was updated to version 3.1. This version doubles the bandwidth USB 3.0 was capable of, reaching up to 10Gbps. The bigger change comes in its power delivery specification, which now provides up to 20V, 5A, enough to power even notebooks. Apart from the higher power delivery, power flow is now bidirectional, meaning either the host or the peripheral can provide power, whereas before only the host device could.
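To put those ratings in perspective, wattage is just volts multiplied by amps. Here's a quick sketch; the profiles listed are illustrative pairings, not an exhaustive list of what the spec allows:

```python
# Wattage available under a few USB power ratings: watts = volts x amps
profiles = {
    "USB 2.0 (5V, 500mA)": (5.0, 0.5),
    "USB 3.0 (5V, 900mA)": (5.0, 0.9),
    "USB 3.1 PD (20V, 5A)": (20.0, 5.0),
}

for name, (volts, amps) in profiles.items():
    print(f"{name}: {volts * amps:.1f}W")  # 2.5W, 4.5W, and 100.0W
```

That jump from 2.5W to 100W is what makes charging notebooks over USB feasible.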
Here’s a table of the different USB versions:
| Version | Bandwidth | Power Delivery | Connector Type |
| --- | --- | --- | --- |
| USB 1.0/1.1 | 1.5Mbps/12Mbps | 5V, 500mA | Type-A to Type-A, Type-A to Type-B |
| USB 2.0 | 480Mbps | 5V, 500mA | Type-A to Type-A, Type-A to Type-B |
| USB 3.0 | 5Gbps | 5V, 900mA | Type-A to Type-A, Type-A to Type-B |
| USB 3.1 | 10Gbps | 5V, up to 2A; 12V, up to 5A; 20V, up to 5A | Type-C to Type-C, Type-A to Type-C |
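To make those bandwidth figures concrete, here's a rough calculation of how long a 1GB file would take at each version's theoretical maximum; real-world speeds are lower due to protocol overhead:

```python
# Time to move a 1GB file (8,000 megabits) at each version's peak rate
FILE_MEGABITS = 8_000  # 1 gigabyte = 8 gigabits = 8,000 megabits

speeds_mbps = {
    "USB 1.1": 12,
    "USB 2.0": 480,
    "USB 3.0": 5_000,
    "USB 3.1": 10_000,
}

for version, mbps in speeds_mbps.items():
    print(f"{version}: {FILE_MEGABITS / mbps:.1f} seconds")
```

That works out to roughly 11 minutes on USB 1.1 versus under a second on USB 3.1.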
Now that we’ve established the background of how USB has evolved from its initial release, there are two things to keep in mind: One, each new version of USB usually just bumps its transfer rate and power delivery, and two, there haven’t been any huge changes regarding the ports and connectors aside from the doubling of pin count when USB 3.0 was introduced. So, what’s next for USB?
USB Type-C isn’t your average connector
After USB 3.1 was announced, the USB Implementers Forum (USB-IF), which handles USB standards, followed it up with a new connector, USB Type-C. The new design promised to fix the age-old issue of orientation when plugging a connector into a port: there’s no “wrong” way to plug in a Type-C connector since it’s reversible. Another issue it addresses is how older connectors hinder the creation of thinner devices, which isn’t the case for the Type-C connector’s slim profile.
From the looks of it, the Type-C connector could become the only connector you’ll ever need in a device. It has high bandwidth for transferring 4K content and other large files, as well as power delivery that can run even most 15-inch notebooks. It’s also backwards compatible with previous USB versions, although you might have to use a Type-A-to-Type-C cable, which is becoming more common anyway.
Another big thing about USB Type-C is that it can support different protocols in its alternate mode. As of last year, Type-C ports are capable of outputting video via DisplayPort or HDMI, but you’ll have to use the necessary adapter and cable to do so. Intel’s Thunderbolt 3 technology is also listed as an alternate mode partner for USB Type-C. If you aren’t familiar with Thunderbolt, it’s basically a high-speed input/output (I/O) protocol that supports the transfer of both data and video on a single cable. Newer laptops have this built in.
Rapid adoption of the Type-C port has already begun, as seen on notebooks such as Chromebooks, Windows convertibles, and the latest Apple MacBook Pro line. Smartphones using the Type-C connector are also increasing in number.
Summing things up, the introduction of USB Type-C is a huge step forward when it comes to I/O protocols, as it can support almost everything a consumer would want for their gadgets: high-bandwidth data transfer, video output, and charging.
The secrets behind iPhone 13’s Cinematic Mode
Together with Apple’s VP for iPhone Product Marketing as well as their Human Interface Designer
For the first time ever, we had a three-way interview with Apple’s VP for iPhone Product Marketing, Kaiann Drance, as well as one of their leading Human Interface Designers, Johnnie Manzari. If you’re not starstruck enough, both of them appeared in Apple’s September 2021 Keynote event!
Other than new camera sensors, newer camera features are also found on the new iPhone 13 Series. One of those is the new Cinematic Mode.
If you’ve watched some of our latest iPhone videos, including the Sierra Blue iPhone 13 Pro Max unboxing, we’ve let you take a sneak peek at that new video mode.
We’re not gonna lie, it’s one amazing camera feature Apple has managed to deliver.
But what are the secrets behind it? And how do the technicalities work?
Watch our 16-minute interview with the Apple executives explaining why Cinematic Mode is the next big thing in mobile videography.
How Google alerted the Philippines during the July earthquake
Back in July, an earthquake rocked Metro Manila. Unbeknownst to most but noticed by some, a globally renowned company was helping everyone through the natural incident: Google. In the few minutes leading up to and during the 6.7 magnitude earthquake, Android users received important alerts warning them of the ongoing tremors. Though it wasn’t the dreaded Big One, the alert afforded attentive users a few precious seconds to either seek appropriate cover or stop doing dangerous tasks.
Notably, the tech behind Google’s earthquake alert system wasn’t hastily built on existing databases or social media reports. Google effectively turned Android phones into a fully responsive network of earthquake sensors.
Faster than an earthquake
The ever-increasing speed of technology has been a hot topic since the rise of smartphones. Developers and users alike have wondered how accurately or quickly our favorite devices can warn us of things happening around us. There’s even an XKCD comic about how Twitter can warn us of an earthquake moments before the shaking reaches the reader.
Over the years, technology has developed new ways to deliver alerts. From simple weather apps to city-wide messaging systems, users can receive warnings in a timely fashion. Practically nothing is a surprise anymore with the right technology.
That said, Google has successfully developed a new system that relies on other Android smartphones to accurately tell whether or not an earthquake is happening.
A quake detector in your pocket
Speaking to Android Police, the feature’s lead engineer Marc Stogaitis described how Google’s earthquake sensor leverages other devices to warn users about a quake. It all revolves around the different sensors built inside your phone.
As it is, every smartphone comes with a host of sensors to support its different functions: a light detector can seamlessly adjust brightness and camera settings, and a gyroscope can support compasses, for example. For earthquakes, the signal that matters is a smartphone’s movement and vibration.
According to the lead engineer, figuring out the metrics for detecting an earthquake wasn’t a problem. After decades of accurate seismograph technology, developers already have an idea of what they need to measure.
However, the technology doesn’t stop there. Naturally, there are hiccups to relying on just a single (or even every) phone’s data. For one, a city-wide messaging system can set off every phone in an area at once, potentially causing false positives. Plus, relying on a single phone is definitely tricky: plenty of everyday actions cause vibrations akin to an earthquake.
Crowdsourcing a quake
The feature doesn’t rely on just one phone, and it doesn’t tap into every Android phone in an area either. Instead, it collates data from phones plugged into a charger. A plugged-in phone is the most reliable source: it won’t die out in the middle of an earthquake and ruin a source of data. Additionally, charging phones are often stationary, so they won’t be affected by motions that mimic earthquakes.
Google “listens” to charging devices in an area. If the subset meets the criteria for an earthquake, the company quickly determines the earthquake’s epicenter (based on approximate location) and magnitude. Once the system declares that a quake is indeed happening, it sends out an alert to nearby devices and gives them the time needed to seek shelter.
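The approach described above can be sketched roughly like this. The threshold and device-count numbers here are made-up placeholders for illustration, not Google's actual parameters:

```python
# Hypothetical sketch of crowdsourced quake detection: only charging
# (hence stationary, reliable) phones contribute readings, and an alert
# fires only when many devices in one area report shaking together.
def should_alert(readings, shake_threshold=0.3, min_devices=50):
    """readings: list of (is_charging, peak_acceleration_g) tuples."""
    shaking = [
        g for charging, g in readings
        if charging and g >= shake_threshold  # skip unplugged or still phones
    ]
    # One rattling phone (say, on a washing machine) can't trigger an
    # alert; agreement across many devices is required.
    return len(shaking) >= min_devices

# 60 charging phones shaking at once looks like a quake...
print(should_alert([(True, 0.5)] * 60))   # True
# ...but a handful of noisy, unplugged phones does not.
print(should_alert([(False, 0.9)] * 10))  # False
```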
The alerts naturally prioritize people nearer to the epicenter. But, of course, the speed will ultimately depend on the phone’s connectivity. A phone hooked up to a building’s fast Wi-Fi connection will receive alerts faster than a commuter’s phone on data while going through a tunnel.
Still, the short time that the alerts give users is enough to save themselves from a precarious situation. Though the feature can potentially warn users of quakes minutes in advance, Stogaitis says that it will more realistically push alerts five to ten seconds before the incident. However, five seconds is enough to go under a table and have some sort of protection against falling debris.
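Those few seconds of warning come from simple physics: the alert crosses the network almost instantly, while damaging seismic waves travel at only a few kilometers per second. A back-of-the-envelope sketch, using an assumed wave speed and system delay rather than Google's actual figures:

```python
# Warning time = wave travel time to the user minus detection/alert delay
WAVE_SPEED_KMS = 3.5   # assumed S-wave speed, km per second
SYSTEM_DELAY_S = 4.0   # assumed detection + alert delivery delay

def warning_seconds(distance_km):
    return max(0.0, distance_km / WAVE_SPEED_KMS - SYSTEM_DELAY_S)

for km in (10, 35, 70):
    print(f"{km}km from the epicenter: ~{warning_seconds(km):.0f}s of warning")
```

Under these assumptions, users very close to the epicenter get no warning at all, while those 35km away get about six seconds, in line with the five-to-ten-second window Stogaitis mentions.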
Still keeping things private
For anyone worrying about how Google handles their data, Stogaitis says that the company removes all identifiers from the data except for approximate location. Even with that restriction, Google maintains that the feature will be as accurate as it can be. Either way, it will be useful for earthquakes to come.
The earthquake sensor is available for any Android phone running Lollipop (Android 5.0) and above. Naturally, the feature still requires users to turn on emergency alerts on their phone.
The industry’s next big thing: Cloud gaming explained
It’s gaming on the go, but for internet that’s not slow
Everybody’s getting into gaming these days, and you can’t blame them. With the pandemic still ravaging the world, people turn to their consoles or PCs for some action. However, not everyone can afford expensive PCs and next-gen consoles when they come out.
Instead, a new player enters the fray with a pretty great idea. What if you could just play your favorite games from any device? And what if we told you they wouldn’t take up space on your device at all? That’s basically what cloud gaming offers: a way to play games from any device at any time!
So, how does that actually work? What do you need to ensure quality gameplay, and should you even consider it?
The basics of playing on a cloud
On paper, it’s pretty easy to understand how cloud gaming works. Basically, you get access to a library of games that live on a cloud service. When you subscribe, you can play that library from virtually any device regardless of its specs. You also don’t have to worry about storage problems, since the games are stored and run on a server.
It’s no joke when these companies tell you that you can play your games on any device. With their dedicated data servers, they make sure the games run smoothly once you access them from the cloud. On your end, you’ll need a strong and consistent internet connection to play smoothly.
Several companies already offer cloud gaming services people can subscribe to. Examples include NVIDIA’s GeForce Now, Microsoft’s xCloud, and Google Stadia, all of which run PC games on their servers. These companies even update their server hardware every so often to deliver the best possible quality.
System requirements for cloud gaming
Much like your ordinary PC or gaming console, cloud gaming services need certain equipment to run smoothly. First, companies must set up active data centers and server farms to run the games. These data centers keep games up and running while reducing latency; in other words, they serve as the powerhouse of cloud gaming.
Next on the list is the network infrastructure needed to stream those games to users. To ensure that people don’t experience lag when they play, companies also invest in acquiring proper data connections. In most cases, though, this isn’t something these companies fully control; it mostly depends on the available internet service providers.
On the front end, companies also provide dedicated hardware and software to access the cloud. For example, NVIDIA integrated GeForce Now into its own streaming device, the NVIDIA Shield, back in 2013. Meanwhile, Google Stadia relies heavily on pre-existing Google software like Google Chrome and the Stadia app.
Something great to offer, for the most part
Cloud gaming services offer something unique in the industry. Essentially, they spare the user from investing in expensive PCs by letting people play from virtually any device. Whether it’s on a smartphone, laptop, or even a smart TV, people get access to games at high frame rates without an RTX 3080.
Furthermore, the game and save files are stored on the cloud and don’t take up any storage on your devices. This is a great benefit for people already running on limited storage space, especially if they play Call of Duty: Warzone. With everything stored on the cloud, you don’t need to give up most of a 512GB SSD.
However, one of the biggest issues with cloud gaming revolves around the very thing it’s based on: the internet. Specifically, the user’s internet connection, since these services need a fast, stable link to run smoothly on any device. Basically, you’ll want either an Ethernet or a 5G wireless connection to ensure the lowest latency possible.
That infrastructure isn’t readily available in most markets, a prominent issue in several developing countries. Furthermore, even where providers have 5G in the pipeline, those same providers also put data caps on it. Even if users can play at an optimal frame rate, they’re doing so with a restriction in place.
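To see why connection quality matters so much, consider a rough latency budget for a single cloud-gamed frame. All of these numbers are illustrative assumptions, not measurements from any particular service:

```python
# Milliseconds spent at each stage between pressing a button and seeing
# the result on screen (illustrative figures only)
budget_ms = {
    "input upload": 10,
    "server render": 16,       # about one frame at 60fps
    "video encode": 5,
    "network transit": 20,     # one-way; grows quickly on poor links
    "client decode + display": 10,
}

print(f"Total input-to-photon latency: {sum(budget_ms.values())}ms")  # 61ms
```

Even with a good connection, every stage adds delay, which is why a capped or congested network can make an otherwise powerful server feel sluggish.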
Does this new player have any place?
With the world continuously opening its arms to the gaming industry, innovation is at the forefront of success. Companies come up with a variety of gaming technologies that cater to a wide variety of people. From individual hardware to pre-built systems, gaming has often revolved around physical equipment.
Cloud gaming doesn’t just give people another option in the mix. Rather, it challenges the notion of availability and accessibility and offers a viable solution, taking physical hardware limitations off the user’s end and making gaming available to everyone.
But like most gaming technologies, everything is still limited somehow. These systems still experience bottlenecks on both the provider’s and the user’s end. In the end, it will depend on how much you’re willing to shell out, and how willing you are to accept the risks.