Explainers

Why is USB Type-C so important?

Over the past decade, devices using the Universal Serial Bus (USB) standard have become part of our daily lives. From transferring data to charging our devices, the standard has continued to evolve, with USB Type-C as its latest development. Here’s why you should care about it.

First, here’s a little history

Chances are you’ve encountered devices that have a USB port, such as a smartphone or computer. But what exactly is the USB standard? Simply put, it’s a communication protocol that lets devices talk to each other through a standardized port and connector. It’s basically what a shared language is for humans.

Here’s an example of a USB hub that uses Type-A connectors (Image credit: Anker)

When USB was first introduced to the market, the connectors used were known as USB Type-A. You’re likely familiar with this connector; it’s rectangular and can only be plugged in one orientation. To make a connection, a USB Type-A connector plugs into a USB Type-A port, much like an appliance plugs into a wall outlet. The port usually resides on host devices such as computers and media players, while Type-A connectors usually belong to peripherals such as keyboards and flash drives.

There are also USB Type-B connectors, which usually go on the other end of a USB cable and plug into devices like a smartphone. Because external devices come in different sizes, there are a few different Type-B designs: printers and scanners use the Standard-B port, older digital cameras and phones use the Mini-B port, and recent smartphones and tablets use the Micro-B port.

Samples of the different USB Type-B connectors. From left to right: Standard-B, Mini-B, and Micro-B (Image credit: Amazon)

Specifications improved through the years

Aside from the type of connectors and ports, another integral part of the USB standard lies in its specifications. As with all specifications, these document the capabilities of the different USB versions.

The first-ever version, USB 1.0, arrived in 1996 and defined two speeds: Low Speed at 1.5Mbps (megabits per second) and Full Speed at 12Mbps. That version never made it into consumer products, though. Instead, the first revision, USB 1.1, was released in 1998 and became the first widely adopted version, keeping the same maximum transfer rate of 12Mbps.

The next version, USB 2.0, was released in 2000 and raised the maximum transfer rate significantly, to 480Mbps. These early versions can also act as power sources at 5V, with devices drawing 100mA by default and up to 500mA after negotiating with the host.

Next up was USB 3.0, introduced in 2008 with a transfer rate of up to 5Gbps (gigabits per second), a tenfold increase over the previous version. This feat was achieved by roughly doubling the pin count and wires. To make them easier to spot, USB 3.0 connectors and ports are usually colored blue, compared to the usual black or gray of USB 2.0 and below. USB 3.0 also improves on power delivery, with a rating of 5V, 900mA.

In 2013, USB was updated to version 3.1, which doubles USB 3.0’s bandwidth to a maximum of 10Gbps. The bigger change comes in its power delivery specification, which now provides up to 20V, 5A (100W), enough to power even notebooks. Power is also bidirectional this time around: either the host or the peripheral can provide it, whereas previously only the host could.
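To put those power figures in perspective, wattage is simply volts multiplied by amps. Here’s a quick sketch using the ratings mentioned above (the version labels are ours, purely for illustration):

```python
# Power (watts) = volts x amps, using the ratings discussed above
specs = {
    "USB 1.x / 2.0": (5, 0.5),   # 5V, 500mA
    "USB 3.0":       (5, 0.9),   # 5V, 900mA
    "USB 3.1 (PD)":  (20, 5.0),  # 20V, 5A
}

for version, (volts, amps) in specs.items():
    print(f"{version}: {volts * amps:.1f}W")

# Output:
# USB 1.x / 2.0: 2.5W
# USB 3.0: 4.5W
# USB 3.1 (PD): 100.0W
```

That jump from 2.5W to 100W is what makes charging a notebook over USB realistic.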

Here’s a table of the different USB versions:

| Version | Bandwidth | Power Delivery | Connector Type |
| --- | --- | --- | --- |
| USB 1.0/1.1 | 1.5Mbps/12Mbps | 5V, 500mA | Type-A to Type-A, Type-A to Type-B |
| USB 2.0 | 480Mbps | 5V, 500mA | Type-A to Type-A, Type-A to Type-B |
| USB 3.0 | 5Gbps | 5V, 900mA | Type-A to Type-A, Type-A to Type-B |
| USB 3.1 | 10Gbps | 5V up to 2A; 12V up to 5A; 20V up to 5A | Type-C to Type-C, Type-A to Type-C |

Now that we’ve established how USB has evolved since its initial release, there are two things to keep in mind: one, each new version of USB usually just bumps the transfer rate and power delivery; and two, there haven’t been any huge changes to the ports and connectors aside from the extra pins introduced with USB 3.0. So, what’s next for USB?

USB Type-C isn’t your average connector

After USB 3.1 was announced, the USB Implementers Forum (USB-IF), the group that maintains USB standards, followed it up with a new connector: USB Type-C. The new design fixes the age-old issue of orientation when plugging a connector into a port; there’s no “wrong” way to plug in a Type-C connector since it’s reversible. Its slim profile also addresses another issue: older connectors were too bulky for ever-thinner devices.

Here’s what a USB Type-C connector looks like. Left: Type-A to Type-C cable; right: Type-C to Type-C cable (Image credit: Belkin)

From the looks of it, the Type-C connector could become the only connector you’ll ever need on a device. It has enough bandwidth for transferring 4K content and other large files, as well as enough power delivery for most 15-inch notebooks. It’s also backwards compatible with previous USB versions, although you might have to use a Type-A to Type-C cable, which is becoming more common anyway.

Another big thing about USB Type-C is that it can support different protocols through its Alternate Mode. As of last year, Type-C ports can output video via DisplayPort or HDMI, though you’ll need the appropriate adapter and cable to do so. Intel’s Thunderbolt 3 technology is also an Alternate Mode partner for USB Type-C. If you aren’t familiar with Thunderbolt, it’s a high-speed input/output (I/O) protocol that carries both data and video over a single cable, and newer laptops have it built in.

A USB Type-C Thunderbolt 3 port (with compatible dock/adapter) does everything you’ll ever need when it comes to I/O ports (Image credit: Intel)

Rapid adoption of the Type-C port has already begun, as seen on notebooks such as Chromebooks, Windows convertibles, and the latest Apple MacBook Pro line. Smartphones using the Type-C connector are also increasing in number.

Summing things up, the introduction of USB Type-C is a huge step forward when it comes to I/O protocols, as it can support almost everything a consumer would want for their gadgets: high-bandwidth data transfer, video output, and charging.

SEE ALSO: SSD and HDD: What’s the difference?



No more cords: Wireless charging explained

More and more things are going wireless


A lot of things have gone wireless over the past few years. From internet connections to gaming with your friends, the world is becoming more accessible without the need for physical wires. Over the course of 2018, another aspect of our lives has gone this route: charging our devices.

Perhaps you’ve already heard of wireless charging and its presence in today’s smartphones, particularly the latest Apple devices. You may have even owned something that could wirelessly charge devices. But, what is wireless charging all about?

Let’s break down the technicalities

Wireless charging sounds exotic, but the idea behind it is well-established electronics: electromagnetic induction. A charging pad contains a coil that generates an alternating electromagnetic field. That field carries energy, and a matching coil inside a compatible device converts it back into electricity when the device is placed on the pad.
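For the physics-inclined, the effect at work is described by Faraday’s law of induction. This is a textbook relation, not anything specific to a particular charging standard:

$$\mathcal{E} = -N \frac{d\Phi}{dt}$$

Here, $\mathcal{E}$ is the voltage induced in the device’s receiving coil, $N$ is the coil’s number of turns, and $\Phi$ is the magnetic flux from the pad passing through it. The pad drives its coil with alternating current precisely so that $\Phi$ keeps changing, which is what induces a voltage on the receiving side.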

There are two ways devices can charge wirelessly: inductive charging and resonance charging. Inductive charging is mostly found in low-power devices, ones that require less electricity to power up. Its range is so limited that your phone only charges while it sits directly on the pad. Resonance charging, on the other hand, trades efficiency for range: it works over a longer distance, but less of the transmitted energy reaches the device.

Induction charging

Within the last ten years, several non-profit organizations have created wireless charging standards for companies to follow. The most popular is the Qi standard, established in 2008 by the Wireless Power Consortium (WPC). Other standards include the Power Matters Alliance (PMA) standard from 2012, and Rezence by the Alliance for Wireless Power (A4WP), which ran from 2012 to 2015.

All about that Qi

As mentioned earlier, the Qi standard is the most popular wireless charging standard in the world, and most of today’s wirelessly charging smartphones and peripherals support it. Smartphones first adopted it in 2012 through the Nokia Lumia 920.

Qi focuses primarily on energy regulation. Most charging pads that use the standard have flat surfaces for better energy distribution. Qi chargers regulate the amount of power they deliver to devices and go on standby once the device is full. They also activate only when a device is placed on top, saving electricity in the process.
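To make that behavior concrete, here’s a toy sketch of the regulation logic described above. This is purely illustrative: it’s not the actual Qi protocol, and the wattage values are made-up placeholders:

```python
class QiStylePad:
    """Toy model of a Qi-style pad: idle until a device is detected,
    taper power as the battery fills, and stop once it's full."""

    def __init__(self):
        self.charging = False

    def on_device_detected(self):
        # The pad only energizes its coil once a device sits on it
        self.charging = True

    def power_to_send(self, battery_percent: float) -> float:
        """Watts to deliver this cycle (placeholder values)."""
        if not self.charging:
            return 0.0                 # standby draws almost nothing
        if battery_percent >= 100:
            self.charging = False      # device full: back to standby
            return 0.0
        # Regulate: full power early on, trickle charge near the top
        return 5.0 if battery_percent < 80 else 2.5

pad = QiStylePad()
pad.on_device_detected()
print(pad.power_to_send(50))    # 5.0
print(pad.power_to_send(90))    # 2.5
print(pad.power_to_send(100))   # 0.0, and the pad goes on standby
```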

Magnetic resonance charging

Most smartphone companies have chosen to implement the Qi standard in their latest models. Apart from Nokia, companies like LG and Samsung adopted it, beginning with the Nexus 4 and the Galaxy S6, respectively. In 2017, Apple embraced the standard with the release of the iPhone 8, iPhone 8 Plus, and iPhone X. The company also planned a charging mat called AirPower that could charge multiple devices at once, but it has yet to launch.

Why do most companies prefer Qi, but some don’t?

The goal of the WPC is to put forward a single worldwide standard for wireless charging. The organization designed Qi so that companies can integrate it into their products seamlessly. The standard is also why some phones align with their charging pads through magnets, improving charging efficiency.

Apart from that, the Qi standard allows for more intelligent control over charging your phone. It can tell when your phone is fully charged and stops sending electricity to avoid overcharging. Of course, you’ll get the most out of your Qi wireless charger if you’re only charging one device at a time.

Wireless chargers for the Razer Phone 2, Google Pixel 3, and Xiaomi Mi Mix 3

However, some companies recognize that most people own several smart devices. This is where other organizations like the Power Matters Alliance come in. PMA initially built on inductive charging, the same base Qi uses. Since then, the organization has pursued resonance charging, which removes the range limitation Qi has.

That’s one of the reasons Samsung, for example, incorporated both the Qi and PMA standards into the Galaxy S6. With resonance charging, devices can charge a few centimeters away from the pad, which is especially handy for people who use their phones while charging. And while the WPC is looking to fold resonance charging into Qi, compatibility issues across devices have so far made that extension less effective.

What does the future hold for wireless charging?

With all the talk about standards and devices, there’s no denying that wireless charging is here to stay. There are talks between the WPC and PMA on possibly coming up with just one true standard for all companies to follow. The best part is that it doesn’t stop there.

Both organizations are looking to expand their technologies beyond smartphones and consumer devices. The WPC has already done so by partnering with furniture retailers like IKEA to build wireless charging into office tables and couches. Meanwhile, PMA is looking to bring wireless charging to restaurants and establishments like McDonald’s and Starbucks through wireless-charging tables. Tech startups, for their part, are developing their own hardware for wireless charging over longer distances.

It’s safe to say that the future is definitely bright for wireless charging. Whether companies will start making it a must-have feature for all their products remains to be seen.

Illustrations by MJ Jucutan


Here’s what you need to know about eSIM

The technology behind Apple’s first dual-SIM iPhone


When Apple first revealed their new iPhone XS and iPhone XS Max, people were expecting something different. While on the outside nothing seems to have changed, the inside is a whole different story. The most notable change is the introduction of eSIM (embedded SIM) technology, something Apple has done before with the Apple Watch.

But, what is this eSIM? How different is it from the SIM card that you know and love? And does using an eSIM change the game completely?

Let’s talk about the SIM and eSIM

One of the essentials for any phone on the market is a SIM card. Short for Subscriber Identity Module, a SIM card contains the key identification and security data from your network carrier. Networks use it to identify their subscribers and provide mobile connectivity: calls, texts, and access to the internet. SIM cards also let you carry that identity (and any contacts stored on the card) with you when you switch devices.

eSIM technology, as the name implies, is embedded into the phone itself, yet it keeps the same functionality as before. On devices designed with only one SIM card slot, adding an eSIM turns them into virtual dual-SIM machines.

How have regions adopted eSIM?

As mentioned earlier, this isn’t the first time Apple has dealt with eSIM tech. The company initially launched eSIM support with the Apple Watch Series 3 to give it better connectivity on the go. While Apple continues to incorporate eSIM in the newer Watch Series 4, they’ve decided to take it one step further with the iPhone XS and iPhone XS Max.

However, as of writing, only ten countries currently support eSIM, mostly because these are the countries with the carrier infrastructure to support it. While smartphone companies are looking to incorporate the new technology, the market for it remains relatively small.

The good and bad about eSIM

Like any other new technology, eSIM comes with its own set of benefits and difficulties, especially for those transitioning from the traditional SIM card. With an eSIM built into your phone, you no longer have to go through the hassle of buying a specific SIM card.

Ideally, having an eSIM also lets you switch between networks easily. An eSIM-capable phone comes with the software needed to make the switching process fast and painless. In essence, the eSIM also frees up the physical SIM card slot for another card, if your device supports one. This is most helpful when you travel abroad and need a local number to access a foreign network.
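Conceptually, an eSIM is a small rewritable chip that stores one or more carrier profiles, with the phone’s software deciding which one is active. Here’s a rough sketch of that idea; the class and method names are hypothetical, not any platform’s real API:

```python
from dataclasses import dataclass, field

@dataclass
class CarrierProfile:
    carrier: str   # e.g., your home carrier, or a local one abroad
    number: str    # the phone number tied to this profile

@dataclass
class ESim:
    profiles: list = field(default_factory=list)
    active: CarrierProfile = None

    def download_profile(self, profile: CarrierProfile):
        # Provisioning happens over the air; no plastic card to swap
        self.profiles.append(profile)

    def activate(self, carrier: str):
        # Switching networks becomes a software operation
        for profile in self.profiles:
            if profile.carrier == carrier:
                self.active = profile
                return
        raise ValueError(f"No downloaded profile for {carrier}")

# Traveling abroad: download a local profile and switch to it
sim = ESim()
sim.download_profile(CarrierProfile("HomeTelco", "+1-555-0100"))
sim.download_profile(CarrierProfile("LocalTelco", "+44-20-7946-0000"))
sim.activate("LocalTelco")
print(sim.active.carrier)   # LocalTelco
```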

However, some processes prove to be difficult with eSIM. One of them is quickly transferring your phone number to another phone, especially if you switch devices often. Unlike a traditional SIM card, where you simply move the card over, you’d have to contact your service provider to activate the number on your new phone. This can be cumbersome depending on your provider’s customer service.

Furthermore, if the eSIM in your phone becomes corrupted or damaged in any way, you might need to replace the whole phone. Because the eSIM is integrated into the device, it can’t simply be pried out when things go wrong. A traditional SIM card doesn’t have this problem: if the card gets destroyed, you just replace the card.

Are smartphones ready for the eSIM?

eSIM technology is still in its early stages, and only a handful of devices currently support it. There is potential for the tech to be implemented across more devices in the future, despite only a few countries welcoming it so far. For now, though, a lot of people still rely on traditional SIM cards, given the difficulties of using an eSIM.

In the case of the new iPhones, for example, iOS can’t run two instances of the same chat app. So even with two numbers active at the same time, you’d need a separate phone for another WhatsApp or Viber account, at least until Apple comes up with a software patch for this.

In the end, the technology’s impact can only be measured once more devices embrace it. But, for now, let’s celebrate how the eSIM gave us the first dual-SIM iPhone and see where the future will take us.

Illustrations by MJ Jucutan


All filters: Article 13 of the EUCD explained

Is this the end for memes everywhere?


If you haven’t been on the web much lately, this may have slipped past your radar. On September 12, 2018, the European Parliament voted to pass a directive that could change the way we approach the internet for years to come. Keep in mind, though, that this was only the initial review; a final vote happens next year.

What is this directive, and why is the internet involved? Why are people suddenly seeing #Article13 trend on Twitter a few hours after the decision was passed? What’s with this #SaveTheInternet nonsense?

Understanding the copyright directive

The directive at the forefront of this entire debacle is the European Union Copyright Directive, or EUCD. The EUCD aims to harmonize regulations protecting intellectual property across the EU. It was first adopted in 2001, implementing the 1996 World Intellectual Property Organization (WIPO) Copyright Treaty. Earlier this year, a new version of the directive was drafted with added articles and stipulations.

Basically, the EUCD seeks to create measures that protect copyright over created content. The range of intellectual property covered includes music, videos, images, algorithms and code, and even software. The directive calls for member countries to enact and implement laws that protect copyright owners, and those obligations eventually extend to the big companies operating within the EU.

You might be wondering why there’s an outcry over it in the first place, especially when the directive’s purpose seems clear. Well, there’s one particular part of the EUCD that a lot of people disagree with: Article 13.

The unlucky Article 13

Article 13 of the EUCD isn’t a lengthy piece of reading. The whole article contains three provisions for implementing copyright protection on websites that host user-generated content. It specifically targets websites that store large amounts of user-generated content and whose main purpose, or one of their main purposes, is earning profit. Basically, any website that lets you upload your own content and earn money from it is covered by the directive.

The article also states that such websites should adopt measures like “effective content recognition technologies,” complaint management systems, and tracking solutions, and that these should be in effect from the moment users upload content to the site. With such measures in place, content creators and service providers can properly engage in discussions should there be a dispute. It’s similar in spirit to YouTube’s existing Content ID system.
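As a toy illustration of what “content recognition” could mean at its simplest, here’s a sketch that compares an upload’s hash against a database of registered works. Real systems rely on perceptual fingerprints so they can match re-encoded or partial copies; this naive version only catches byte-identical files:

```python
import hashlib

# Hypothetical database of hashes registered by rights holders;
# this sample entry is simply sha256(b"test")
COPYRIGHTED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(upload: bytes) -> bool:
    """Flag the upload if its hash matches a registered work."""
    return hashlib.sha256(upload).hexdigest() in COPYRIGHTED_HASHES

print(is_flagged(b"test"))           # True
print(is_flagged(b"original work"))  # False
```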

Websites like YouTube, Twitch, Facebook, and Twitter, as well as streaming apps such as Spotify, Apple Music, and IGTV (when monetization is available) are most likely the article’s main targets. The directive also explicitly states that non-profit service providers and online marketplaces will not be affected. So, Wikipedia and Shopee aren’t affected, don’t worry.

The ongoing debate towards copyright protection

For some people, the EUCD is inherently good for intellectual property protection. They argue that the primary goal of the directive is to protect rights holders from piracy and copyright infringement. Through the EUCD, there will be systems in place that protect music labels, content creators, and publishers from any illegal use of their content online. For this camp, users should be held liable for infringement of any kind; memes, remixes, and parodies are a few examples.

Furthermore, the directive affects not only users but also the companies that run these websites. It effectively mandates companies to build better content recognition systems, or tighten their existing ones for stricter copyright protection. If they don’t make adjustments, they’ll be held liable for any infringement-related issues. For supporters of the EUCD, Article 13 is simply a needed improvement.

However, others believe the directive is a little too extreme and could potentially do more harm than good. Leading institutions and companies in the tech industry think the provisions are too vague, leaving them open to interpretation. That opens the door for companies to abuse copyright claims without effective ways to intervene. Furthermore, any significant changes to existing systems would be costly to implement.

The bigger picture here is how the directive affects the internet as a whole. Big names in the tech industry argue that it’s an attack on the creative freedom of users. Instead of keeping the internet an open space for creativity, it adds more filters and restrictions to the process. Basically, you couldn’t put up an Avengers meme without getting the approval of Disney and Marvel Studios first.

So, what happens now?

The EUCD was put in place to protect copyright, a simple and basic goal. There’s broad recognition that measures must be in place to uphold copyright, and no denying that big companies have to abide by intellectual property rules or suffer severe consequences for infringement. However, a lot of people argue that these measures are both vague and extreme: the directive not only restricts creative freedom in producing quality content, it also makes the whole compliance process costly and rigid.

At the end of the day, everybody wants to protect copyright. The argument over the EUCD is already past the debate on whether protecting copyright is right or wrong. The debate now is whether an open platform like the internet should be kept that way or be strictly policed at all costs.

All of these will come into play in January 2019, when the European Parliament casts its vote for or against the directive. If you have the time to read the EUCD, you can access the full document here.
