Category Archives: Telecommunications

5G Wireless Defined

In my 20 April 2016 post, “5G is Coming, Slowly,” I discussed the evolution of mobile communications technology and the prospects for the deployment of the next generation: 5G. The complexity of 5G service relative to current generation 4G (LTE) service is daunting because of rapidly increasing technical demands that greatly exceed LTE core capabilities. Examples of technical drivers for 5G include the population explosion in the Internet of Things (IoT), the near-term deployment of operational self-driving cars, and the rise of virtual and augmented reality mobile applications.

Progress toward 5G is steadily being made. Here’s a status update.

1. International Telecommunications Union (ITU) technical performance requirements

The ITU is responsible for international standardization of mobile communications technologies. On 23 February 2017, the ITU released a draft report containing their current consensus definition of the minimum technical performance requirements for 5G wireless (IMT-2020) radio service.

The ITU authors note:

“….the capabilities of IMT-2020 are identified, which aim to make IMT-2020 more flexible, reliable and secure than previous IMT when providing diverse services in the intended three usage scenarios, including enhanced mobile broadband (eMBB), ultra-reliable and low-latency communications (URLLC), and massive machine type communications (mMTC).”

The ITU’s draft technical performance requirements report is a preliminary product of the second stage of the ITU’s standardization process for 5G wireless deployment, which is illustrated below:

ITU-IMT2020 roadmap crop

Source: ITU

The draft technical performance requirements report provides technical definitions and performance specifications in each of the following categories:

  • Peak data rate
  • Peak spectral efficiency (bits per hertz of spectrum)
  • User experience data rate
  • 5th percentile user spectral efficiency
  • Average spectral efficiency
  • Area traffic capacity
  • Latency
  • Connection density
  • Energy efficiency
  • Reliability
  • Mobility
  • Mobility interruption time
  • Bandwidth
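As a rough illustration, a candidate radio interface could be screened against three of the draft’s headline minimums: a 20 Gbps peak data rate, 1 ms latency, and one million connected devices per square kilometer. The Python sketch below is illustrative only; the actual IMT-2020 evaluation covers all thirteen categories above with detailed test conditions.

```python
# Three headline IMT-2020 minimums from press coverage of the draft report.
IMT_2020_MINIMUMS = {
    "peak_data_rate_gbps": 20.0,              # downlink peak data rate
    "latency_ms": 1.0,                        # URLLC user-plane latency (upper bound)
    "connection_density_per_km2": 1_000_000,  # mMTC connection density
}

def meets_imt2020(candidate: dict) -> dict:
    """Return a pass/fail flag for each headline requirement."""
    return {
        "peak_data_rate_gbps":
            candidate["peak_data_rate_gbps"] >= IMT_2020_MINIMUMS["peak_data_rate_gbps"],
        "latency_ms":
            candidate["latency_ms"] <= IMT_2020_MINIMUMS["latency_ms"],
        "connection_density_per_km2":
            candidate["connection_density_per_km2"] >= IMT_2020_MINIMUMS["connection_density_per_km2"],
    }

# An LTE-class system (illustrative figures) falls short on all three counts.
lte_like = {"peak_data_rate_gbps": 1.0, "latency_ms": 10.0,
            "connection_density_per_km2": 100_000}
print(meets_imt2020(lte_like))
```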

You’ll find a good overview of the ITU’s draft performance requirements in an article by Sebastian Anthony entitled, “5G Specs Announced: 20 Gbps Download, 1 ms Latency, 1M Devices per Square Km,” at the following link:

https://arstechnica.com/information-technology/2017/02/5g-imt-2020-specs/

You can download the ITU’s draft report, entitled “DRAFT NEW REPORT ITU-R [IMT-2020 TECH PERF REQ] – Minimum requirements related to technical performance for IMT-2020 radio interface(s),” at the following link:

https://www.itu.int/md/R15-SG05-C-0040/en

In the ITU standardization process diagram, above, you can see that their final standardization documents will not be available until 2019 – 2020.

2. Industry 5G activities

Meanwhile, the wireless telecommunications industry isn’t waiting for the ITU to finalize IMT-2020 before developing and testing 5G technologies and making initial 5G deployments.

3rd Generation Partnership Project (3GPP)

In February 2017, the organization 5G Americas summarized the work by 3GPP as follows:

“As the name implies the IMT-2020 process is targeted to define requirements, accept technology proposals, evaluate the proposals and certify those that meet the IMT-2020 requirements, all by the 2020 timeframe. This however, requires that 3GPP start now on discussing technologies and system architectures that will be needed to meet the IMT-2020 requirements. 3GPP has done just that by defining a two phased 5G work program starting with study items in Rel-14 followed by two releases of normative specs spanning Rel-15 and Rel-16 with the goal being that Rel-16 includes everything needed to meet IMT-2020 requirements and that it will be completed in time for submission to the IMT-2020 process for certification.”

The 2016 3GPP timeline for development of technologies and system architectures for 5G is shown below.

3GPP roadmap 2016

Source: 3GPP / 5G Americas White Paper

Details are presented in the 3GPP / 5G Americas white paper, “Wireless Technology Evolution Towards 5G: 3GPP Releases 13 to Release 15 and Beyond,” which you can download at the following link:

http://www.5gamericas.org/files/6814/8718/2308/3GPP_Rel_13_15_Final_to_Upload_2.14.17_AB.pdf

Additional details are in a February 2017 3GPP presentation, “Status and Progress on Mobile Critical Communications Standards,” which you can download here:

http://www.3gpp.org/ftp/Information/presentations/Presentations_2017/CCE-2017-3GPP-06.pdf

In this presentation, you’ll find the following diagram that illustrates the many functional components that will be part of 5G service. The “Future IMT” in the pyramid below is the ITU’s IMT-2020.

ITU 5G functions

Source: 3GPP presentation

AT&T and Verizon plan initial deployments of 5G technology

In November 2016, AT&T and Verizon indicated that their initial deployment of 5G technologies would be in fixed wireless broadband services. In this deployment concept, a 5G wireless cell would replace IEEE 802.11 wireless or wired routers in a small coverage area (i.e., a home or office) and connect to a wired / fiber terrestrial broadband system. Verizon CEO Lowell McAdam referred to this deployment concept as “wireless fiber.” You’ll find more information on these initial 5G deployment plans in the article, “Verizon and AT&T Prepare to Bring 5G to Market,” on the IEEE Spectrum website at the following link:

http://spectrum.ieee.org/telecom/wireless/verizon-and-att-prepare-to-bring-5g-to-market

Under Verizon’s current wireless network densification efforts, additional 4G nodes are being added to better support high-traffic areas. These nodes are closely spaced (likely 500 – 1,000 meters apart) and also may be able to support early demonstrations of a commercial 5G system.

Verizon officials previously have talked about an initial launch of 5G service in 2017, but also have cautioned investors that this may not occur until 2018.

DARPA Spectrum Collaboration Challenge 2 (SC2)

In my 6 June 2016 post, I reported on SC2, which eventually could benefit 5G service by:

“…developing a new wireless paradigm of collaborative, local, real-time decision-making where radio networks will autonomously collaborate and reason about how to share the RF (radio frequency) spectrum.”

If SC2 is successful and can be implemented commercially, it would enable more efficient use of the RF bandwidth assigned for use by 5G systems.

3. Conclusion

Verizon’s and AT&T’s plans for early deployment of a subset of 5G capabilities are symptomatic of an industry in which the individual players are trying hard to position themselves for a future commercial advantage as 5G moves into the mainstream of wireless communications. This commercial momentum is outpacing ITU’s schedule for completing IMT-2020. The recently released draft technical performance requirements provide a more concrete (interim) definition of 5G that should remove some uncertainty for the industry.


10th Anniversary of the iPhone

On 9 January 2007, Steve Jobs introduced the iPhone at Macworld in San Francisco, and the smart phone revolution moved into high gear.

Steve Jobs introduces the iPhone in 2007. Source: Apple

Fortunately, the first iPhone image he showed during this product introduction was a joke.

Not the iPhone

Not the 2007 original iPhone. Source: Apple

In the product introduction, Steve Jobs described the iPhone as:

  • Widescreen iPod with touch controls
  • Revolutionary mobile phone
  • Breakthrough internet communications device

You can watch a short (10 minute) video of the historic iPhone product introduction at the following link:

https://www.youtube.com/watch?v=MnrJzXM7a6o

A longer version (51 minutes) with technical details about the original iPhone is at the following link:

https://www.youtube.com/watch?v=vN4U5FqrOdQ

These videos are good reminders of the scope of the innovations in the original iPhone.

2007 ad for the original iPhone. Source: web.archive.org

The original iPhone was a 2G device. The original applications included IMAP/POP e-mail, SMS messaging, iTunes, Google Maps, Photos, Calendar and Widgets (weather and stocks). The Apple App Store did not yet exist.

iTunes and the App Store are two factors that contributed to the great success of the iPhone. The iPhone App Store opened on 10 July 2008, via an update to iTunes. The App Store allowed Apple to completely control third-party apps for the first time. On 11 July 2008, the iPhone 3G was launched and came pre-loaded with iOS 2.0.1 with App Store support. Now users could personalize the capabilities of their iPhones in a way that was not available from other mobile phone suppliers.

You’ll find a good visual history of the 10-year evolution of the iPhone on The Verge website at the following link:

http://www.theverge.com/2017/1/9/14211558/iphone-10-year-anniversary-in-pictures

What mobile phone were you using 10 years ago? I had a Blackberry, which was fine for basic e-mail, terrible for internet access / browsing, and useless for applications. From today’s perspective, 10 years ago, the world was in the Dark Ages of mobile communications. With 5G mobile communications coming soon, it will be interesting to see how our perspective changes just a few years from now.


The First Digital Camera Started a Revolution in Photography and Much More

In 1975, I was shooting photographs with a Nikon F2 single lens reflex (SLR) film camera I bought two years before. The F2 was introduced by Nikon in September 1971, and was still Nikon’s top-of-the-line SLR when I bought it in 1973. I shot slide film because I liked the quality of the large projected images. I was quite happy with my Kodak Carousel slide projector and circular slide trays, even though the trays took up a lot of storage space. Getting print copies of slides for family and friends took time and money, but I was used to that. Little did I suspect, at the time, that a revolution was brewing at Eastman Kodak.

The first digital camera prototype: 1975

In 1975, Steve Sasson invented the digital camera while working at Kodak. This first digital camera weighed 8 pounds (3.6kg), was capable of taking 0.01 megapixel (10,000 pixel) black & white photos, and storing 30 photos on a removable digital magnetic tape cassette. An image captured by the camera’s 100 x 100 pixel Fairchild CCD (charge coupled device) sensor was stored in RAM (random access memory) in about 50 milliseconds (ms). Then it took 23 seconds to record one image to the digital cassette tape. For the first time, photos were captured on portable digital media, which made it easy to rapidly move the image files into other digital systems.
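Working from the figures above, a quick back-of-the-envelope calculation shows just how slow the tape path was compared with the RAM capture:

```python
# Back-of-the-envelope numbers for the 1975 prototype, from the figures above.
pixels = 100 * 100          # 0.01 megapixels, black & white
capture_s = 0.050           # ~50 ms to latch an image into RAM
tape_write_s = 23           # seconds to record one image to cassette tape
images_per_tape = 30        # images per removable cassette

tape_rate_pixels_per_s = pixels / tape_write_s
full_tape_minutes = images_per_tape * tape_write_s / 60

print(f"Tape write rate: ~{tape_rate_pixels_per_s:.0f} pixels/s")
print(f"Filling one 30-image cassette: ~{full_tape_minutes:.1f} minutes")
```

In other words, the tape was roughly 460 times slower than the capture into RAM, and filling a cassette took over eleven minutes of recording time.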

Steve Sasson & the first digital camera. Source: MagaPixel

David Friedman, who has interviewed many contemporary inventors, interviewed Steve Sasson in 2011. I think you’ll enjoy his short video interview, which reveals details of how the first digital camera was designed and built, at the following link:

https://vimeo.com/22180298

Arrival of consumer digital cameras: 1994

In February 1994, almost 20 years after Steve Sasson’s first digital camera, Apple introduced the Kodak-manufactured QuickTake 100, which was the first mass market color consumer digital camera available for under $1,000.

Apple QuickTake 100. Source: Apple

The QuickTake 100 could take digital photos at either 0.3 megapixels (high resolution) or 0.08 megapixels (standard resolution), and store the image files on an internal (not removable) 1 MB flash EPROM (erasable programmable read-only memory). The EPROM could store 32 standard or eight high-resolution images, or a combination. Once downloaded, these modest-resolution images were adequate for many applications requiring small images, such as pasting a photo into an electronic document.
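Dividing the EPROM capacity by the image counts above gives the rough per-image storage budget (a sketch that assumes the full 1 MB was usable for image data):

```python
# Rough per-image storage implied by the figures above, assuming the full
# 1 MB (2**20 bytes) of flash EPROM was available for image data.
EPROM_BYTES = 2**20
standard_images = 32   # 0.08 megapixel images per full EPROM
highres_images = 8     # 0.3 megapixel images per full EPROM

print(EPROM_BYTES // standard_images)  # bytes per standard image
print(EPROM_BYTES // highres_images)   # bytes per high-resolution image
```

That works out to about 32 KB per standard image and 128 KB per high-resolution image, far less than the raw pixel data would need, so the camera clearly compressed or packed the images before storage.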

In the following years before the millennium, the consumer and professional photography markets were flooded with a vast array of rapidly improving digital cameras and much lower prices for entry-level models. While my old Nikon F2 film camera remained a top-of-the-line camera for many years back in the 1970s, many newly introduced digital cameras were obsolete by the time they were available in the marketplace.

For a comprehensive overview of the evolution of digital photography, I refer you to Roger L. Carter’s DigiCamHistory website, which contains an extensive history of film and digital photography from the 1880s through 1999.

http://www.digicamhistory.com/Index.html

Film cameras are dead – well almost. On 22 June 2009, Kodak announced that it would cease selling Kodachrome film by the end of 2009. Except for continuing production of professional film for movies, Kodak exited the film business after 74 years. FujiFilm and several other manufacturers continue to offer a range of print and slide film. You can read an assessment of the current state of the film photography industry at the following link:

http://www.thephoblographer.com/2015/04/23/manufacturers-talk-state-film-photography-industry/

Arrival of camera phones: 2000

In the new millennium, we were introduced to a novel type of camera, the camera phone, which was first introduced in Japan in 2000. There seems to be some disagreement as to which was the first camera phone. The leading contenders are:

  • Samsung SCH-V200, which could take 0.35 megapixel photos and store them on an internal memory device
  • Sharp (J-Phone) J-SH04, which could take 0.11 megapixel photos and send them electronically

At that time, small point-and-shoot digital cameras typically were taking much better photos in the 0.78 – 1.92 megapixel range (1024 x 768 pixels to 1600 x 1200 pixels), with high-end digital SLRs taking 10 megapixel photos (3888 x 2592 pixels).

In November 2002, Sprint became the first service provider in the U.S. to offer a camera phone, the Sanyo SCP-5300, which could take 0.3 megapixel (640 x 480 pixels) photos and included many features found on dedicated digital cameras.

Sanyo SCP-5300. Source: Sprint

In late 2003, the Ericsson Z1010 mobile phone introduced the front-facing camera, which enabled the user to conveniently take a “selfie” photo or video while previewing the image on the phone’s video display. Narcissists around the world rejoiced! A decade later they rejoiced again following the invention of the now ubiquitous, and annoying “selfie stick”.

Ericsson Z1010. Source: www.GSMArena.com

You’ll find more details on the history of the camera phone at the following link:

http://www.digitaltrends.com/mobile/camera-phone-history/

Arrival of smartphones: 1993

The 1993 IBM Simon is generally considered to be the first “smartphone.” It could function as a phone, pager, and PDA (personal digital assistant), with simple applications for calendar, calculator, and address book, but no built-in camera. The important feature of the smartphone was its ability to run various applications to expand its functionality.

The first mobile phone actually referred to as a “smartphone” was Ericsson’s 1997 model GS88 concept phone, which led in 2000 to the Ericsson model R380. The R380 was the first mobile phone marketed as a smartphone, but it had no camera.

With the introduction of camera phones and smartphones in 2000, and front-facing cameras in 2003, it wasn’t long before the most popular mobile phones were smartphones with two cameras. Now, just 13 years after this convergence of technologies, smartphones seem to be everywhere, and these devices have evolved into very capable tools for high-resolution still and video photography, as well as photo processing and video editing using specialized applications that the user can install.

With these capabilities available in a small, integrated mobile device, it’s no wonder that the sale of dedicated digital cameras has been declining rapidly.

Impact of mobile phone cameras on dedicated camera sales

Here is a comparison of the digital image sensors on three representative modern cameras:

  • Nikon D800 DSLR camera: 36 megapixels (7360 × 4912 pixels), FX full-frame (35.9 x 24.0 mm, 43.18 mm diagonal) CMOS image sensor
  • Sony DSC-HX90V compact point-and-shoot camera: 18.2 megapixels (4896 x 3672 pixels), 1/2.3 type (6.17mm x 4.55mm, 7.67 mm diagonal) CMOS image sensor
  • Apple iPhone 6 cameras: Main camera: 8 megapixels (3264 x 2448 pixels), 1/2.94 type (4.8mm x 3.6mm, 6.12 mm diagonal) CMOS, Sony Exmor RS image sensor. Front-facing camera: 1.2 megapixels
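One useful derived number is the approximate pixel pitch, i.e., sensor width divided by horizontal pixel count (a rough sketch using the figures above; it ignores each sensor’s non-imaging border area):

```python
# Approximate pixel pitch (sensor width / horizontal pixel count) for the
# three sensors listed above; larger photosites generally gather more light.
sensors = {
    "Nikon D800 (full frame)": (35.9, 7360),   # (width in mm, horizontal pixels)
    "Sony DSC-HX90V (1/2.3)":  (6.17, 4896),
    "iPhone 6 main (1/2.94)":  (4.8,  3264),
}

for name, (width_mm, h_pixels) in sensors.items():
    pitch_um = width_mm * 1000 / h_pixels   # mm -> micrometers per pixel
    print(f"{name}: ~{pitch_um:.2f} µm pixel pitch")
```

The full-frame D800’s photosites come out roughly three to four times wider than those in the compact and phone sensors, which is a big part of why larger sensors perform better in low light.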

The Nikon’s FX sensor is the same size as the image frame in a 35 mm film camera. This is called a “full frame” sensor. Most digital cameras have smaller image sensors, as shown in the following comparison chart.

Comparison of digital image sensor sizes. Source: www.techhive.com

The iPhone 6 image sensor is smaller than any shown in the above chart. Nonetheless, its photo and video quality can be quite good.

For more information on digital camera image sensors, check out the 2013 article by Jackie Dove, “Demystifying digital camera sensors once and for all,” at the following link:

http://www.techhive.com/article/2052159/demystifying-digital-camera-sensors-once-and-for-all.html

The rapid rise in the quality of mobile phone cameras is making small digital cameras redundant, and is having a dramatic impact on the sale of dedicated cameras, as shown in the following chart.

Chart: dedicated camera sales over time

Source: Mayflower Concepts, petapixel.com

The above chart indicates that only about 40 million dedicated cameras of all types were sold in 2014, far below the peak of about 120 million units in 2010. The biggest impact has been on compact digital cameras, with DSLR cameras holding their own, at least for the moment.

While I still like my current Nikon DSLR, I have to admit that I’ve found some higher-end compact digital cameras that have most of the capabilities I want in an SLR but in a much smaller package. While I won’t make my mobile phone camera my primary camera, I may retire the DSLR.

Immediate communications and privacy

The rapid rise of the smartphone was enabled by the deployment of 3G and 4G cellular phone service. See my 20 March 2016 post on the evolution of cellular service for details on the deployment timeline.

With access to capable wireless communications networks and a host of photo and video applications and services, the cameras on mobile phones became tools for capturing images or videos of anything and instantly communicating these via the Internet to audiences that can span the globe. We’re now living in a world where many awkward moments get recorded, meals get photographed before they’re eaten, and there’s a need to post a selfie during an event to prove that you actually were there (and of course, to impress your friends). Thanks to the advent of the cloud, all of these digital photographic memories can be preserved online forever, or at least until you don’t want to continue paying for cloud storage.

Privacy is becoming a thing of the past. What happens in Vegas probably gets photographed by someone and, if you’re lucky, stays in the cloud…..until it’s needed, or hacked.

I don’t think Steve Sasson imagined such a future when he invented the first digital camera in 1975.

New DARPA Grand Challenge: Spectrum Collaboration Challenge (SC2)

On 23 March 2016, the Defense Advanced Research Projects Agency (DARPA) announced the SC2 Grand Challenge in Las Vegas at the International Wireless Communications Expo (IWCE). DARPA described this new Grand Challenge as follows:

“The primary goal of SC2 is to imbue radios with advanced machine-learning capabilities so they can collectively develop strategies that optimize use of the wireless spectrum in ways not possible with today’s intrinsically inefficient approach of pre-allocating exclusive access to designated frequencies. The challenge is expected to both take advantage of recent significant progress in the fields of artificial intelligence and machine learning and also spur new developments in those research domains, with potential applications in other fields where collaborative decision-making is critical.”

You can read the DARPA press release on the SC2 Grand Challenge at the following link:

http://www.darpa.mil/news-events/2016-03-23

SC2 is a response to the rapid growth in demand for wireless spectrum by both U.S. military and civilian users.  A DARPA representative stated, “The current practice of assigning fixed frequencies for various uses irrespective of actual, moment-to-moment demand is simply too inefficient to keep up with actual demand and threatens to undermine wireless reliability.”  The complexity of the current radio frequency allocation in the U.S. can be seen in the following chart.

U.S. radio frequency spectrum allocation chart

Chart Source: U.S. Department of Commerce, National Telecommunications and Information Administration

You can download a high-resolution PDF copy of the above U.S. frequency spectrum chart at the following link:

https://www.ntia.doc.gov/files/ntia/publications/2003-allochrt.pdf

15 July 2016 Update: FCC allocates frequency spectrum to facilitate deploying 5G wireless technologies in the U.S.

On 14 July 2016, the Federal Communications Commission (FCC) announced:

“Today, the FCC adopted rules to identify and open up the high frequency airwaves known as millimeter wave spectrum. Building on a tried-and-true approach to spectrum policy that enabled the explosion of 4G (LTE), the rules set in motion the United States’ rapid advancement to next-generation 5G networks and technologies.

The new rules open up almost 11 GHz of spectrum for flexible use wireless broadband – 3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum. With the adoption of these rules, the U.S. is the first country in the world to open high-band spectrum for 5G networks and technologies, creating a runway for U.S. companies to launch the technologies that will harness 5G’s fiber-fast capabilities.”

You can download an FCC fact sheet on this decision at the following link:

https://www.fcc.gov/document/rules-facilitate-next-generation-wireless-technologies

These new rules change the above frequency allocation chart by introducing terrestrial 5G systems into high frequency bands that historically have been used primarily by satellite communication systems.

5G is Coming, Slowly

The 5th generation of mobile telephony and data services, or 5G, soon will be arriving at a cellular service provider near each of us, but probably not this year. To put 5G in perspective, let’s start with a bit of telecommunications history.

1. Short History of Mobile Telephony and Data Services

0G non-cellular radio-telephone service:

  • 1946 – Mobile Telephone Service (MTS): This pre-cellular, operator-assisted mobile radio-telephone service required a full duplex VHF radio transceiver in the mobile user’s vehicle to link the user’s phone to the carrier’s base station, which connected the call to the public switched telephone network (PSTN) and gave access to the land line network. Each call was assigned a specific frequency in the spectrum set aside for radio-telephone use. This type of access is called frequency division multiple access (FDMA). When the Bell System introduced MTS in St. Louis in 1946, only three channels were available, later increasing to 32 channels.
  • 1964 – Improved Mobile Telephone Service (IMTS): This service provided full duplex UHF/VHF communications between a radio transceiver (typically rated at 25 watts) in the mobile user’s vehicle and a base station that covered an area 40 – 60 miles (64.3 – 96.6 km) in diameter. Each call was allocated to a specific frequency. The base station connected the call to the PSTN, which gave access to the land line network.

1G analog cellular phone service:

  • 1983 – Advanced Mobile Phone System (AMPS): This was the original U.S. fully automated, wireless, portable, cellular standard developed by Bell Labs. AMPS operated in the 800 MHz band and supported phone calls, but not data. The control link between the cellular phone and the cell site was a digital signal. The voice signal was analog. Motorola’s first cellular phone, DynaTAC, operated on the AMPS network.
    •  AMPS used FDMA, so each user call was assigned to a discrete frequency for the duration of the call. FDMA resulted in inefficient use of the carrier’s allocated frequency spectrum.
    • In Europe the comparable 1G standards were TACS (Total Access Communications System, based on AMPS) and NMT (Nordic Mobile Telephone).
    • The designation “1G” was retroactively assigned to analog cellular services after 2G digital cellular service was introduced. As of 18 February 2008, U.S. carriers were no longer required to support AMPS.

2G digital cellular phone and data services:

  • 1991 – GSM (Global System for Mobile Communications), launched in Finland, was the first digital wireless standard. 2G supports digital phone calls, SMS (Short Message Service) text messaging, and MMS (Multimedia Messaging Service). 2G networks typically provide data speeds ranging from 9.6 kbits/s to 28.8 kbits/s, which is too slow for useful Internet access in most cases. Phone calls, messages, and the control link between the cellular phone and the cell site all are digital signals.
    •  GSM operates on the 900 and 1,800 MHz bands using TDMA (time division multiple access) to manage up to 8 users per frequency channel. Each user’s digital signal is parsed into discrete time slots on the assigned frequency and then reassembled for delivery.
    • Today GSM is used in about 80% of all 2G devices.
  • 1995 – Another important 2G standard is Interim Standard 95 (IS-95), which was the first code division multiple access (CDMA) standard for digital cellular technology. IS-95 was developed by Qualcomm and adopted by the U.S. Telecommunications Industry Association in 1995. In a CDMA network, each user’s digital signal is parsed into discrete coded packets that are transmitted and then reassembled for delivery. For a similar frequency bandwidth, a CDMA cellular network can handle more users than a TDMA network.
  • 2003 – EDGE (Enhanced Data Rates for GSM Evolution) is a backwards compatible evolutionary development of the basic 2G GSM system. EDGE generally is considered to be a pre-3G cellular technology. It uses existing GSM spectra and TDMA access and is capable of improving network capacity by a factor of about three.
  •  In the U.S., some cellular service providers plan to terminate 2G service by the end of 2016.
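The TDMA scheme described above can be sketched as a simple slot map (illustrative Python only; actual GSM burst and frame structure is more involved):

```python
# A toy sketch of GSM-style TDMA: one frequency channel is divided into
# 8 repeating time slots, and each active user is assigned one slot.
SLOTS_PER_FRAME = 8

def assign_slots(users):
    """Map up to 8 users onto the time slots of one TDMA frequency channel."""
    if len(users) > SLOTS_PER_FRAME:
        raise ValueError("channel full: at most 8 users per GSM carrier")
    return {slot: user for slot, user in enumerate(users)}

frame = assign_slots(["alice", "bob", "carol"])
print(frame)  # {0: 'alice', 1: 'bob', 2: 'carol'}
```

Each user transmits only during their slot, and the receiver reassembles the interleaved bursts, which is how eight calls share one frequency that FDMA would have dedicated to a single call.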

3G digital cellular phone and data services:

  • 1998 – There are two main 3G standards for cellular data: IMT-2000 and CDMA2000. All 3G networks deliver higher data speeds of at least 200 kbits/s and lower latency (the amount of time it takes for the network to respond to a user command) than 2G. High Speed Packet Access (HSPA) technology enables even higher 3G data speeds, up to 3.6 Mbits/s. This enables a very usable mobile Internet experience with applications such as global positioning system (GPS) navigation, location-based services, video conferencing, and streaming mobile TV and on-demand video.
    • IMT-2000 (International Mobile Telecommunications 2000) accommodates three different access technologies: FDMA, TDMA and CDMA. Its principal implementations in Europe, Japan, Australia and New Zealand use wideband CDMA (W-CDMA) and are commonly known as the Universal Mobile Telecommunications System (UMTS). Service providers must install almost all new equipment to deliver UMTS 3G service. W-CDMA requires a larger available frequency spectrum than CDMA.
    • CDMA2000 is an evolutionary development of the 2G CDMA standard IS-95. It is backwards compatible with IS-95 and uses the same frequency allocation. CDMA2000 cellular networks are deployed primarily in the U.S. and South Korea.
  • 3.5G enhances performance further, bringing cellular Internet performance to the level of low-end broadband Internet, with peak data speeds of about 7.2 Mbits/s.

4G digital cellular phone and data services:

  • 2008 – IMT Advanced: This standard, adopted by the International Telecommunications Union (ITU), defines basic features of 4G networks, including all-IP (internet protocol) based mobile broadband, interoperability with existing wireless standards, and a nominal data rate of 100 Mbit/s while the user is moving at high speed relative to the station (i.e., in a vehicle).
  • 2009 – LTE (Long Term Evolution): The primary standard in use today is known as 4G LTE, which first went operational in Oslo and Stockholm in December 2009. Today, all four of the major U.S. cellular carriers offer LTE service.
    • In general, 4G LTE offers full IP services, with a faster broadband connection and lower latency compared to previous generations. The theoretical peak data speed is up to 1 Gbps, which typically translates to between 1 Mbps and 10 Mbps for the end user.
    • There are different ways to implement LTE. Most 4G networks operate in the 700 to 800 MHz range of the spectrum, with some 4G LTE networks operating at 3.5 GHz.

2. The Hype About 5G

The goal of 5G is to deliver a superior wireless experience with speeds of 10Mbps to 100Mbps and higher, with lower latency, and lower power consumption than 4G. Some claim that 5G has the potential to offer speeds up to 40 times faster than today’s 4G LTE networks. In addition, 5G is expected to reduce latency to under a millisecond, which is comparable to the latency performance of today’s high-end broadband service.

With this improved performance, 5G will enable more powerful services on mobile devices, including:

  • Rapid downloads / uploads of large files; fast enough to stream “8K” video in 3-D. This would allow a person with a 5G smartphone to download a movie in about 6 seconds that would take 6 minutes on a 4G network.
  • Enable deployment of a wider range of IoT (Internet of Things) devices on networks where everything is connected to everything else and IoT devices are communicating in real-time. These devices include “smart home” devices and longer-lasting wearable devices, both of which benefit from 5G’s lower power consumption and low latency.
  • Provide better support for self-driving cars, each of which is a complex IoT node that needs to communicate in real time with external resources for many functions, including navigation, regional (beyond the range of the car’s own sensors) situation awareness, and requests for emergency assistance.
  • Provide better support for augmented reality / virtual reality and mobile real-time gaming, both of which benefit from 5G’s speed and low latency
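The movie example above implies roughly a 60x speedup. A quick sanity check, using an assumed movie size and illustrative link rates (not guaranteed 4G or 5G figures):

```python
# Sanity-check the "6 seconds on 5G vs. 6 minutes on 4G" movie claim.
def download_seconds(size_gigabytes: float, rate_mbps: float) -> float:
    """Time to transfer a file: gigabytes -> gigabits -> seconds at rate_mbps."""
    return size_gigabytes * 8 * 1000 / rate_mbps

movie_gb = 3.0                             # assumed HD movie size
t_4g = download_seconds(movie_gb, 67)      # ~67 Mbit/s effective 4G link (assumed)
t_5g = download_seconds(movie_gb, 4000)    # ~4 Gbit/s effective 5G link (assumed)

print(f"4G: {t_4g:.0f} s, 5G: {t_5g:.0f} s, speedup: {t_4g / t_5g:.0f}x")
```

With these assumed rates, the 4G download takes about six minutes and the 5G download about six seconds, consistent with the roughly 60x ratio in the claim.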

3. So what’s the holdup?

5G standards have not yet been finalized and published. The ITU’s international standard is expected to be known as IMT-2020. Currently, the term “5G” doesn’t signify any particular technology.

5G development is focusing on use of super-high frequencies, as high as 73 GHz. Higher frequencies enable faster data rates and lower latency. However, at the higher frequencies, the 5G signals are usable over much shorter distances than 4G, and the 5G signals are more strongly attenuated by walls and other structures. This means that 5G service will require deployment of a new network architecture and physical infrastructure with cell sizes that are much smaller than 4G. Cellular base stations will be needed at intervals of perhaps every 100 – 200 meters (328 to 656 feet). In addition, “mini-cells” will be needed inside buildings and maybe even individual rooms.
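The distance penalty at millimeter-wave frequencies can be illustrated with the free-space path loss formula (a simplification that ignores walls, foliage, and rain, all of which make matters worse at high frequencies):

```python
import math

# Free-space path loss: FSPL(dB) = 20*log10(d_m) + 20*log10(f_Hz) - 147.55
def fspl_db(distance_m: float, freq_hz: float) -> float:
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

d = 200  # meters, roughly one small-cell spacing from the text
print(f"800 MHz @ {d} m: {fspl_db(d, 800e6):.1f} dB")
print(f" 73 GHz @ {d} m: {fspl_db(d, 73e9):.1f} dB")
```

At the same 200 m range, the 73 GHz link suffers roughly 39 dB more free-space loss than an 800 MHz link, which is a large part of why 5G cells must be so much smaller than 4G cells.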

Fortunately, higher frequencies allow use of smaller antennae, so we should have more compact cellular hardware for deploying the “small cell” architecture. Get ready for new cellular nomenclature, including “microcells”, “femtocells” and “picocells”.

Because of these infrastructure requirements, deployment of 5G will require a significant investment and most likely will be introduced first in densely populated cities.

The initial introduction date is unlikely to be before 2017.

More details on 5G are available in a December 2014 white paper by GSMA Intelligence entitled, “Understanding 5G: Perspectives on Future Technological Advances in Mobile,” which you can download at the following link:

https://gsmaintelligence.com/research/?file=141208-5g.pdf&download

Note that 5G’s limitations inside buildings and the need for “mini-cells” to provide interior network coverage sound very similar to the limitations for deploying Li-Fi, which uses light instead of radio frequencies for network communications. See my 12 December 2015 post for information on Li-Fi technology.

15 July 2016 Update: FCC allocates frequency spectrum to facilitate deploying 5G wireless technologies in the U.S.

On 14 July 2016, the Federal Communications Commission (FCC) announced:

“Today, the FCC adopted rules to identify and open up the high frequency airwaves known as millimeter wave spectrum. Building on a tried-and-true approach to spectrum policy that enabled the explosion of 4G (LTE), the rules set in motion the United States’ rapid advancement to next-generation 5G networks and technologies.

The new rules open up almost 11 GHz of spectrum for flexible use wireless broadband – 3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum. With the adoption of these rules, the U.S. is the first country in the world to open high-band spectrum for 5G networks and technologies, creating a runway for U.S. companies to launch the technologies that will harness 5G’s fiber-fast capabilities.”

You can download an FCC fact sheet on this decision at the following link:

https://www.fcc.gov/document/rules-facilitate-next-generation-wireless-technologies

These new rules should hasten the capital investments needed for timely 5G deployments in the U.S.