
Last week, we focused on the difference between scholarly and popular sources. While popular sources have an important place in research, searching for them on the Internet raises special considerations. This discussion introduces you to the concept of the “Internet Filter Bubble” and how it can affect your search results when using certain search engines, such as Google. The second discussion this week will examine how to pop the filter bubble.
Prepare: Watch the TED Talk Eli Pariser: Beware Online “Filter Bubbles” (transcript) and read the How to Pop Your Filter Bubble! handout.
Reflect: Consider your reaction to the video and how this topic applies to your own experience researching on the Internet. Think about the suggestions from the How to Pop Your Filter Bubble! handout and select three that you feel will help you pop your filter bubble.

Write: Answer the following questions in your post.

What were your initial thoughts on the filter bubble after watching the TED Talk?
What are the positive and negative effects of the filter bubble, particularly in relation to ethical issues that may arise?
How could this filter bubble impact the research you conduct online for your Final Paper, the Annotated Bibliography?
Which three suggestions for popping your Internet filter bubble did you select? Explain why you chose those three.
Do you feel popping your filter bubble is important for all of the searches (i.e., professional, academic, personal) you conduct online or only some? Why or why not?
To maximize the opportunity for vigorous discussion, you must respond to at least one classmate. Post to this discussion on at least three separate days of the week. Your posts must total at least 400 words after you address the questions noted above. Your first post must be completed by Day 3 (Thursday) and the remainder of your posts must be completed by Day 7 (Monday). You must answer all aspects of the prompt at some point during the week. Also, be sure to reply to your classmates and instructor.

Video Transcript

Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, “Why is this so important?” And Zuckerberg said, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” And I want to talk about what a Web based on that idea of relevance might look like.

0:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there’s this shift in how information is flowing online, and it’s invisible. And if we don’t pay attention to it, it could be a real problem.

So I first noticed this in a place I spend a lot of time: my Facebook page. I’m progressive, politically (big surprise), but I’ve always gone out of my way to meet conservatives. I like hearing what they’re thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.

1:54
So Facebook isn’t the only place that’s doing this kind of invisible, algorithmic editing of the Web. Google’s doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you’re logged out, one engineer told me, there are 57 signals that Google looks at (everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located) that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore. And you know, the funny thing about this is that it’s hard to see. You can’t see how different your search results are from anyone else’s.

2:42
But a couple of weeks ago, I asked a bunch of friends to Google “Egypt” and to send me screenshots of what they got. So here’s my friend Scott’s screenshot. And here’s my friend Daniel’s screenshot. When you put them side by side, you don’t even have to read the links to see how different these two pages are. But when you do read the links, it’s really quite remarkable. Daniel didn’t get anything about the protests in Egypt at all in his first page of Google results. Scott’s results were full of them. And this was the big story of the day at that time. That’s how different these results are becoming.

3:21
So it’s not just Google and Facebook either. This is something that’s sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized: different people get different things. Huffington Post, the Washington Post, the New York Times are all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.”

4:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out. So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So “Iron Man” zips right out, and “Waiting for Superman” can wait for a really long time.

5:02
What they discovered was that in our Netflix queues there’s this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched “Rashomon,” but right now we want to watch “Ace Ventura” for the fourth time. (Laughter) So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they’re mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

5:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society (this is how the founding mythology goes), there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that’s not actually what’s happening right now. What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important (this is what TED does): other points of view.

7:03
And the thing is, we’ve actually been here before as a society. In 1915, it’s not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important: that, in fact, you couldn’t have a functioning democracy if citizens didn’t get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn’t perfect, but it got us through the last century. And so now, we’re kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they’re writing.

7:51
I know that there are a lot of people here from Facebook and from Google (Larry and Sergey), people who have helped build the Web as it is, and I’m grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they’re transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn’t. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it’s not going to do that if it leaves us all isolated in a Web of one.

8:45
Thank you.

8:47
(Applause)