Tag: "technical"

Posted November 3, 2015 by christopher

Chattanooga returns to the Community Broadband Bits podcast this week in episode 175 to talk about their 10 Gbps upgrade, the Fibervention campaign, TN4Fiber, and surpassing 75,000 subscribers.

To cover so much ground, we have three guests joining us from Chattanooga's Electric Power Board (the EPB in EPB Fiber): Danna Bailey is the VP of Corporate Communications, Beth Johnson is the Marketing Manager, and Colman Keane is the Director of Fiber Technology.

Danna gives some background on what they are doing in Chattanooga and how eager people in nearby communities are for Chattanooga to bring local Internet choice to southeast Tennessee, if only the state would stop protecting the AT&T, Comcast, and Charter monopolies from competition.

Beth tells us about the Fibervention campaign and how enthusiastic people become once they experience full fiber optic service powered by a locally based provider.

And finally, Colman talks tech with us regarding the 10 Gbps platform, branded NextNet. We tried to get a bit more technical for the folks who are curious about these cutting-edge technologies on a passive optical network.

Read the transcript from episode 175 here.

We want your feedback and suggestions for the show - please e-mail us or leave a comment below.

This show is 25 minutes long and can be played below on this page or via iTunes or via the tool of your choice using this feed.

Listen to other episodes here or view all episodes in our index. You can download this MP3 file directly from http://muninetworks.org/sites/www.muninetworks.org/files/audio/comm-bb-bits-podcast175-danna-bailey-colman-keane-beth-johnson-epb.mp3.

Thanks to Arne Huseby for the music, licensed using Creative Commons. The song is "Warm Duck Shuffle."

Posted May 12, 2015 by christopher

This week, we are focusing on security concerns for community networks. Community networks tend to be small, with limited staff to keep track of the many challenges of running a network. Our guest this week on Community Broadband Bits is Weston Hecker, Senior Systems Security Analyst and Pen Tester for KLJ.

After defining what penetration testing is, we discuss why small ISPs need to be concerned with security and some of the options available to them.

You can also see Weston talking at DEFCON about burner phones and DDoS attacks.

This is a more technical show than we ordinarily produce. For those who are confused but interested in learning more about security or just the nuts and bolts of how the Internet works, a good weekly show on this topic is Security Now. Look back in the archives for a wealth of relevant topics.

Read the transcript of this discussion here.

We want your feedback and suggestions for the show - please e-mail us or leave a comment below.

This show is 16 minutes long and can be played below on this page or via iTunes or via the tool of your choice using this feed.

Listen to previous episodes here. You can download this MP3 file directly from here.

Find more episodes in our podcast index.

Thanks to Persson for the music, licensed using Creative Commons. The song is "Blues walk."

Posted July 8, 2014 by christopher

Longmont is about to break ground on its citywide FTTH gigabit network, but it is already offering services to local businesses and a few neighborhoods that started as pilot projects. Vince Jordan, previously a guest two years ago, is back to update us on their progress.

Until recently, Vince was the Telecom Manager for Longmont Power and Communications in Colorado. He has decided to return to his entrepreneurial roots now that the utility is moving forward with the citywide project. But he has such a great voice and presence that we wanted to bring him back to share some stories.

We talk about Longmont's progress and how they dealt with a miscalculation in costs that forced them to slightly modify prices for local businesses shortly after launching the service. And finally, we discuss the $50/month gigabit service and how Longmont has been able to drive the price so low.

You can read our full coverage of Longmont from this tag.

Read the transcript from our discussion here.

We want your feedback and suggestions for the show - please e-mail us or leave a comment below. Also, feel free to suggest other guests, topics, or questions you want us to address.

This show is 20 minutes long and can be played below on this page or via iTunes or via the tool of your choice using this feed.

Listen to previous episodes here. You can download this MP3 file directly from here.

Find more episodes in our podcast index.

Thanks to Waylon Thornton for the music, licensed using Creative Commons. The song is "Bronco Romp."

Posted July 2, 2014 by tanderson

This is the first in a two-part series on spectrum basics and how we could better manage the spectrum to encourage innovation and prevent either large corporations or government from interfering with our right to communicate. Part 2 is available here.

We often think of all our wireless communications as traveling on separate paths: television, radio, Wi-Fi, cell phone calls, etc. In fact, these signals are all part of the same continuous electromagnetic spectrum. Different parts of the spectrum have different properties, to be sure - you can see visible light, but not radio waves. But these differences are more a question of degree than a fundamental difference in makeup.

As radio, TV, and other technologies were developed and popularized throughout the 20th century, interference became a major concern. Any two signals using the same band of the spectrum in the same broadcast range would prevent both from being received, which you have likely experienced on your car radio when driving between stations on close frequencies – news and music vying with each other, both alternating with static. 

To mitigate the problem, the federal government did what any Econ 101 textbook says you should when you have a “tragedy of the commons” situation in which more people using a resource degrades it for everyone: it assigned property rights. This is why radio stations tend not to interfere with each other now.

The Federal Communications Commission granted exclusive licenses to the spectrum in slices known as bands to radio, TV, and eventually telecom companies, ensuring that they were the only ones with the legal right to broadcast on a given frequency range within a certain geographic area. Large bands were reserved for military use as well.

Originally, these licenses came free of charge, on the condition that broadcasters meet certain public interest requirements. Beginning in 1993, the government began to run an auction process, allowing companies to bid on spectrum licenses. That practice continues today whenever any space on the spectrum is freed up. (For a more complete explanation of the evolution of licensing, see this excellent Benton Foundation blog post.)

Although there have been several redistributions over the decades, the basic architecture remains....

Read more
Posted May 6, 2014 by christopher

The Open Technology Institute at the New America Foundation, along with ctc Technology and Energy, has released an overview of options for local governments that want to improve Internet access. The report is titled "The Art of the Possible: An Overview of Public Broadband Options."

The paper has been released at an opportune time: more communities than ever are considering what investments they can make at the local level. The Art of the Possible offers different models, from muni ownership and partnerships to co-ops. The paper examines different business models and assesses the risk of various approaches.

It also includes a technical section, written for a non-technical audience, that explains the differences among broadband technologies.

From the introduction:

The one thing communities cannot do is sit on the sidelines. Even the process of evaluating whether a public network is appropriate can be beneficial to community leaders as a means to better understand the communications needs of their residents, businesses, and institutions and whether existing services and networks are keeping pace.

The purpose of this report is to enable communities to begin the evaluation of their broadband options. The report begins with an overview of different network ownership and governance models, followed by an overview of broadband technologies to help potential stakeholders understand the advantages and disadvantages of each technology. It then provides a brief summary of several different business models for publicly owned networks. The final two chapters focus on the potential larger local benefits and the risks of a publicly funded broadband project.

Posted April 29, 2014 by christopher

This week we are welcoming Scott Bradner, a longtime doer, writer, and thinker on Internet matters. Thanks to a listener request, we had already recorded an interview last week discussing peering before the news broke that the FCC would be allowing paid prioritization arrangements, which many have said represents the end of network neutrality. We talked prior to the announcement of the FCC's upcoming rules, so we do not discuss them directly.

We explain what "peering" is and why it is essential to the Internet. It gets a little technical but we try to bring it back with simple examples.

Our take on the Comcast-Netflix deal may surprise some listeners because the arrangement is not as far from the tradition of paid interconnection arrangements as some strong supporters of network neutrality maintain. However, we are explicit in noting that monopoly providers like Comcast may abuse their market power to shake down companies like Netflix. That is worrisome but may best be dealt with by means other than changing the way peering has historically worked.

We end the show discussing the consolidation of ISPs and the role of symmetry in peering.

Scott recommended these two columns and I strongly encourage readers/listeners to read Barbara van Schewick's post on the subject.

Read the transcript from this discussion here.

We want your feedback and suggestions for the show - please e-mail us or leave a comment below. Also, feel free to suggest other guests, topics, or questions you want us to address.

This show is 20 minutes long and can be played below on this page or via iTunes or via the tool of your choice using this feed.

Listen to previous episodes here. You can...

Read more
Posted January 9, 2014 by christopher

This is the second in a series of posts exploring lessons learned from the Seattle Gigabit Squared project, which now appears unlikely to be built. The first post is available here and focuses on the benefits massive cable companies already have, as well as the limits of conduit and fiber in spurring new competition.

This post focuses on the business challenges an entity like Gigabit Squared would face in building the network it envisioned. I am not representing that this is what Gigabit Squared actually faced, but these issues arise for any new provider in that circumstance. I aim to explain why the private sector has not provided, and generally will not provide, competition to companies like Comcast and Time Warner Cable.

Gigabit Squared planned to deliver voice, television, and Internet access to subscribers. Voice can be a bit of a hassle due to the many regulatory requirements, and Internet access is comparatively simple. But television? That is a headache. I've been told by some munis that 90% of the problems and difficulties they experience are with television services.

Before you can deliver ESPN, the Family Channel, or Comedy Central, you have to come to agreement with big channel owners like Disney, Viacom, and others. Even massive companies like Comcast have to pay the channel owners more each year despite having over 10 million subscribers, so you can imagine how difficult it can be for a small firm to negotiate these contracts. Some channel owners may only negotiate with a provider after it has a few thousand subscribers - but getting a few thousand subscribers without good content is a challenge.

Many small firms (including most munis) join a buyers' cooperative called the National Cable Television Cooperative (NCTC) that has many of the contracts available. But even with that substantial help, building a channel lineup is incredibly difficult, and the new competitor will almost certainly pay more for the same channels than a competitor like Comcast or Time Warner Cable does. And some munis, like Lafayette, faced steep barriers in just joining the coop.

(An...

Read more
Posted January 6, 2014 by christopher

A few weeks ago, a Geekwire interview with outgoing Seattle Mayor Mike McGinn revealed that the Gigabit Squared project there was in jeopardy. Gigabit Squared has had difficulty raising all the necessary capital for its project, which would build Fiber-to-the-Home in several neighborhoods, in part by using City-owned fiber to reduce the cost of building its trunk lines.

There are a number of important lessons, none of them new, that we should take away from this disappointing news. This is the first of a series of posts on the subject.

But first, some facts. Gigabit Squared is continuing to work on projects in Chicago and Gainesville, Florida. There has been a shake-up among the company's founders, and it is not clear what it will do next. Gigabit Squared was not the only vendor responding to Seattle's RFP, just the highest-profile one.

Gigabit Squared hoped to raise some $20 million for its Seattle project (for which the website is still live). The original announcement suggested twelve neighborhoods with at least 50,000 households and businesses would be connected. The project is not officially dead, but few have high hopes for it given the change in mayor and many challenges thus far.

The first lesson to draw from this is what we say repeatedly: the broadband market is seriously broken and there is no panacea to fix it. The big cable firms, while beating up on DSL, refuse to compete with each other. They are protected by a moat made up of advantages over potential competitors that includes vast economies of scale allowing them to pay less for advertising, content, and equipment; large existing networks already amortized; vast capacity for predatory pricing by cross-subsidizing from non-competitive areas; and much more.

So if you are an investor with $20 million in cash lying around, why would you ever want to bet against Comcast - especially by investing in an unknown entity that cannot withstand a multi-year price war? You wouldn't and they generally don't. The private sector invests for a return and overbuilding Comcast with fiber almost...

Read more
Posted December 24, 2012 by christopher

If you have been trying to find a book that offers an engaging explanation of how the Internet physically works and the various networks interconnect, search no more. Tubes: A Journey to the Center of the Internet by Andrew Blum has done it.

The author was featured on Fresh Air way back in May, but not much has changed with Internet infrastructure since then.

In Tubes, journalist Andrew Blum goes on a journey inside the Internet's physical infrastructure to uncover the buildings and compounds where our data is stored and transmitted. Along the way, he documents the spaces where the Internet first started, and the people who've been working to make the Web what it is today.

He was also just on C-Span's "The Communicators."

I enjoyed the read and learned a few things along the way. Those looking for a dry, just-the-facts-ma'am approach may not enjoy the frequent musings of Blum on his experiences. But I did.

One of his trips took him to a community in Oregon called The Dalles, where a municipal network allowed Google to build its very first "built-from-scratch data center." More on that in a post to come soon...

Those who are doing their reading on tablets now will be interested to know that the eBook is temporarily priced at $1.99. The deal lasts until New Year's, according to the author.

Posted December 1, 2012 by lgonzalez

Why can Wi-Fi be so great in some places but so awful in others? (Ahem... Hotels.) Time to stop imagining Wi-Fi as magic and instead think of it just as a means of taking one connection and sharing it among many people without wires.

If you take a ho-hum connection and share it with 10 people, it becomes a bad connection. On the other hand, if you take a great connection and share it with those same 10 people, they will be very happy surfers. As Benoit Felten recently told us, the best wireless networks have been built in cities with the best wired infrastructure.
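
To put rough numbers on that point, here is a minimal sketch (hypothetical Python, assuming the wired connection's capacity is simply split evenly among everyone using it at once, which ignores Wi-Fi overhead and uneven usage) of why the same access point feels so different depending on the line behind it.

```python
def per_user_mbps(backhaul_mbps, active_users):
    """Rough estimate of per-user throughput, assuming the shared
    connection is divided evenly among everyone using it at once."""
    return backhaul_mbps / active_users

# A ho-hum 10 Mbps connection shared by 10 people: about 1 Mbps each.
print(per_user_mbps(10, 10))   # 1.0

# A great 100 Mbps connection shared by the same 10 people: about 10 Mbps each.
print(per_user_mbps(100, 10))  # 10.0
```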

Wi-Fi still has hiccups, but they are well worth it for convenience, mobility, and ubiquity. In a recent GigaOm article, Stacey Higgenbotham detailed additional reasons that not all Wi-Fi is created equal.

Higgenbotham lays out the limitations of Wi-Fi and breaks down the technology's touchiness into four main factors. She addresses the issue specifically for travelers. From the article:

Backhaul: For most, Wi-Fi is their access to the internet, but it’s actually just a radio technology that moves information over the air.

Density:…the more people you add to a network — even if those people are just checking their Facebook page — the worse the network will perform.

Movement: Wi-Fi connectivity is designed for fixed access, meaning the radios stay put...when you try to jump from hot spot to hot spot problems occur.
...
Device: Newer phones and tablets are supporting a dual-mode Wi-Fi radio, which means they can hop from the 2.4 GHz band to the 5 GHz band….now with phones like the latest iPhone that have dual-band support, you may hop to another band only to find a bunch of other users.

Higgenbotham doesn't have the answers on how to improve service, but she offers practical advice:

Given these constraints, it hopefully makes a little more sense when you can’t download a movie while on Amtrak or your Facebook video keeps buffering as you surf on the jetway. Unfortunately, with so many variables, there’s not a lot that the Wi-Fi Alliance or even the hot spot provider can always do. If there’s a business case for faster...

Read more
