Digital Sensors

We live in the information age, which means we are more connected than ever before, yet many of us find it a bewildering, complex, connected corporate jungle to operate in. For example, how many times have you felt overwhelmed by the number of emails you have received, or by the vast amount of content available online? This lecture explores the physiological and psychological impact of the digital communication network on you and the organisation, and how, by understanding and modelling these factors, we can better prepare ourselves for the ever-increasing, multifaceted communication tools available to us. As we continue to grow our own digital trees over time, we will also discuss how that information is being used to forecast crime and to change the marketing of products and services forever.

In this lecture Professor Tom Jackson explores how the information age has transformed digital communication, how to determine whether you and your digital tree are showing signs of stress and information overload, and strategies you can adopt to alleviate them.

Video Transcription

Good evening, ladies and gentlemen. It's so great to see so many faces I know, and some new faces, and thank you so much for giving up your time to come and share my inaugural lecture tonight. So, the connected jungle and the digital tree: what does that mean? There are three things I want to cover tonight. One is what that means, looking at your digital tree and the analog tree; the second is the impact of social media on you; and the third, the future: who is using the data that you're feeding into the network?

So when I was asked to do the inaugural lecture, it must have been over a year ago, I was in my office and a PhD student, Junior, was there. I was telling him all about the things we've done over the years, and I said I just cannot come up with a catchy title. He said, well, if you think about the corporate jungle, this is very much like a connected jungle. I thought, brilliant: we are all connected within this jungle, either by analog or via digital. And within this jungle we have a digital tree. What makes up your digital tree is email, tweets, Facebook, texts, anything digital. But you've also got an analog tree: that might be all the things you keep in your brain, or anything that's written down on paper. The complexity is trying to navigate your way through this digital jungle. There's lots of information out there; how do you find the right information to complete a task? If you think about large organizations, maybe some of the banks, where they have over 350,000 employees, trying to actually find the right information, without duplicating it in little pockets of silos, is very complex.
my question to you is when you’re

0:02:08.160,0:02:12.640
thinking about your digital tree which
we all have on now

0:02:12.640,0:02:16.400
does it worry you that that’s maybe
exposed to others across the world

0:02:16.400,0:02:23.280
so they can see what you’re up to
analog tree is somewhat restricted and

0:02:23.280,0:02:27.120
you can’t really see that but the
digital tree is open to all

0:02:27.120,0:02:30.319
so how does that change in terms of
business

0:02:30.319,0:02:34.160
does it add complexity that you’ve got a
digital tree and analog tree to

0:02:34.160,0:02:37.360
navigate off here on the slide here you
can see

0:02:37.360,0:02:41.200
a picture from facebook this actually
shows all the connections

0:02:41.200,0:02:45.920
of facebook account users in the world
such as facebook that’s pretty much all

0:02:45.920,0:02:49.200
the globe
few major countries missing but if you

0:02:49.200,0:02:52.720
start adding twitter
or weeble you start to get a nice

0:02:52.720,0:02:55.200
picture of the whole earth and how it’s
actually connected

0:02:55.200,0:02:57.840
digitally

So, the complex digital age: is it complex? If you think about our students that go into the workplace, it is complex, I think. You've got all this employee knowledge, and they need to try and tap into the right people who know what, and we'll come on to that in more detail in a moment. You've got lots of processes and systems, some quite antiquated: the systems have been in place since the company started, they've never really been replaced, they're not really fit for purpose, but they just muddle along, just keep going. Then, when you're trying to find information, you've got the internet, so you've got about four billion pages of information out there to try and sift through, so information retrieval is quite key. On top of that, you've got about 54 gigabytes of unstructured data to sift through in any one organization. So trying to find the right information can be quite complex.
so let’s take the brain so

0:03:57.680,0:04:02.400
you know how many bytes of information
does a brain hold

0:04:02.400,0:04:05.760
any ideas in terms of bites we’re
talking megabytes

0:04:05.760,0:04:11.200
talking terabytes i’ll put you out your
misery

0:04:11.200,0:04:15.760
based on the number of junctions between
neurons brain contains about 100 billion

0:04:15.760,0:04:20.400
neurons so each has about 20 000 shared
synaptic junctions

0:04:20.400,0:04:25.360
so if we think of those as being a bit a
brain can hold between 100

0:04:25.360,0:04:30.400
and 1000 terabytes of information
so if we think about the analog that’s

0:04:30.400,0:04:35.040
an awful lot of information
that an organization potentially holds

0:04:35.040,0:04:38.320
so trying to find the right person with
that information is difficult

0:04:38.320,0:04:42.080
and trying to elicit that information is
even harder
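
The estimate above can be checked with a quick back-of-the-envelope calculation; this is a sketch of the speaker's figures, treating each synaptic junction as one bit of storage:

```python
# Rough sketch of the brain-capacity estimate quoted in the lecture:
# ~100 billion neurons, each with ~20,000 synaptic junctions,
# and each junction counted as one bit.
neurons = 100e9
junctions_per_neuron = 20_000

total_bits = neurons * junctions_per_neuron   # 2e15 bits
total_bytes = total_bits / 8                  # 2.5e14 bytes
terabytes = total_bytes / 1e12                # 250 TB

print(f"{terabytes:.0f} TB")  # 250 TB, inside the 100-1,000 TB range quoted
```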

Let's put it into context. A thousand terabytes: you can go home tonight and tell people all around you that your brain is the size of a forest. How do I get that calculation? One terabyte is about 50,000 trees made into paper and printed, so if we go for a thousand terabytes we're looking at 50 million trees. Your brain is the size of 50 million trees. I'm looking at Chris here, smiling. In terms of the Amazon, there are four billion trees in the Amazon rainforest, so we need about 8,000 people to make up that rainforest. If you think the size of the Amazon is huge, trying to navigate through that is complex. So even though we think the digital world is complex, the analog world is still complex, and over the years we've done lots of work on knowledge management research, but we're not going to put much of that in here tonight; just to let you know it does exist.
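
The forest comparison follows from the same kind of arithmetic, using the lecture's figure of roughly 50,000 printed trees per terabyte:

```python
# Sketch of the "brain the size of a forest" comparison from the lecture.
trees_per_terabyte = 50_000
brain_terabytes = 1_000          # the upper estimate quoted for the brain

trees_per_brain = trees_per_terabyte * brain_terabytes
print(f"{trees_per_brain:,} trees")  # 50,000,000 trees, as quoted
```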

So, have you ever seen these before? Colleagues in SB will have seen them many a time. This is what happens every 60 seconds around the globe. Look at the top right-hand corner: up here we've got 98,000 tweets sent every minute, and we have 25 hours of YouTube video uploaded every minute; I'm sure that's not Eloise that does all that. In terms of email, there are, I think, 168 million emails every 60 seconds. That's a phenomenal amount of information, and that's 2012. Forget that, we'll move on: we've gone from 25 hours of YouTube video up to 72 hours of YouTube video uploaded; we've introduced Google searches, so we have two million Google searches every minute; emails increase to 204 million; and the number of tweets is up to 278,000. The rate of growth in terms of information is phenomenal. The latest one I could get is this one here: we've gone from 2 million to 2.66 million Google searches at the very top; interestingly, a decrease in email, from 204 million to 138.8 million, a slight decrease; and in terms of tweets we're now up to 433,000 tweets every minute. Phenomenal. Just think about those numbers, all that information: where is it going, where has it been stored, can it be utilized?

Now, when we think about social media, sometimes you think, yes, it's just the young that use social media. Looking at this graph, you see that it's not: at the very top you can see different colors, and each represents a different age group, 18 to 24 and so on. You can see there's generally a very good spread until you get to 65-plus, but of course, as the population is aging all the time and growing, I expect that to increase even further. So you're getting a good representation, really, of people using social media.

The one thing that really surprises me is how things are changing in terms of digital presence. If you think about the birth of a baby, the digital presence now exists three months before the analog birth of the baby, and that's because people know the gender of the baby, so they create a website and Facebook pages all to do with that particular baby. So it's just growing all the time. If you've got time, I recommend you have a look at the nice article on the BBC about how big data is changing lives.

So, the analog and digital tree. I went down to London, to the Tower of London; you probably remember this picture from November last year. I took some great pictures of all the poppies, really moving, but it started to make me think: okay, go back to maybe the First World War, 1914. Did they have a digital tree? No. Did they have an analog tree? Yes, they did, but it was probably tiny, probably about 128K if you try to put it into computer terms, maybe in a year, maybe in a whole lifetime. Move on to 2005, and this is at one of NBC's shows. Looking at the show that's about to begin, you can see maybe just one or two people, I think there's another one somewhere, holding up their mobile phone taking an image. They want a personalized image of that event, even though it'd be on TV. Move to 2013: pretty much everybody out there is trying to take a picture of what's going on, pretty much all copies of the same image, but it's just their personalized image for themselves. What that means is that with the digital age we are all sensors; we are all putting stuff into the grid. Every text, every tweet, every Facebook post, anything digital, we are putting into that grid.

So what does your digital footprint look like? If you have time, this is how I calculated it; there's a web link at the bottom that's quite good to see. This could be email, Facebook, surveillance cameras, online banking, digital photographs. I calculated it for myself, and it turns out to be, for a year, or just under a year, 614 gigabytes of information in just one year. Think back to the people in 1914: maybe 128K max in a year, maybe in a whole lifetime, if you're trying to put that into a digital context. So the rate of information growth is phenomenal, and because it's so big, this is where we have big data: what can we do to try and optimize and utilize all this information?
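
As a rough sketch of that comparison (both figures are the speaker's estimates, and the 1914 number is speculative):

```python
# Comparing the digital footprints quoted in the lecture: ~614 GB in
# roughly a year today versus perhaps 128 KB for a lifetime in 1914.
footprint_today_bytes = 614e9     # 614 GB in about a year
footprint_1914_bytes = 128e3      # 128 KB, a lifetime estimate

growth_factor = footprint_today_bytes / footprint_1914_bytes
print(f"{growth_factor:,.0f}x")   # roughly 4.8 million times more data
```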

So what I want to do is take it right back to basics and look at what we can learn from the oldest social media tool out there, which is email, which is 44 years old. Email was invented by a guy called Ray Tomlinson in 1971. He introduced the @ symbol in the current format of the email address, and he didn't think anything of it at the time, but now it's nearly everywhere.

We can probably learn quite a lot from email communication, because the other social media tools out there now will also go through the same sort of growth and development. Email started out being very basic; we now try to get email to do much more than it was actually invented for, and it's grown very organically. When email is installed within many organizations, you're not actually told or trained how to use it: there's email, get on and use it. Things have also changed in terms of legal requirements, in terms of retention of documents. If you mention anything to do with asbestos in an email, you've got to store that for 50 years. So it's no longer just a personal email going from one person to the next; we've got legal ramifications that mean we must start to store some of this information, and this might come to social media within organizations too. The biggest change is what we call WETs, work extendable technologies: I haven't got mine on me, but mobile phones, iPads. I wonder how many people here are guilty of having their email on their phone, and just before they go to bed they probably just look at the email; I'm looking at Bob to see if he does that. You just check your email before you go to bed, and there'll be some tasks there which might keep you up all night, because you start to worry about them, and we start to become much more addicted and needing to actually check our email. We'll come on to that in more depth.

Over, I suppose, the last 17 years we've done a number of email studies looking at the email landscape. The latest one was with a UK government agency. It was a PhD student, Laura, who did the work, and Jill supervised her as well with myself, and Mike Gleason helped us with all the processing of the cortisol samples. So we went to this UK government agency, and we had just over 30 participants; some dropped out. We monitored them: they had a blood pressure monitor put on them for 24 hours a day, so we had their blood pressure and their heart rate. This is the unpleasant part: they had to spit into a test tube six times a day, and poor Laura then had to put that in the fridge and later freeze it to take back to Loughborough to analyze. Cortisol is your fight or flight: when you come up against it, you need to draw upon your adrenaline, which then draws upon your cortisol. Your cortisol levels are very high in the morning, about nine o'clock, and they decrease throughout the day, until they're pretty much, not depleted, but run right down towards the evening. The participants also did lots of questionnaires, and we monitored them at the desk to find out what they were actually doing. Maybe it's not a shock, but what we did find is that email does cause stress.

On this chart, the left-hand side is the cortisol level, and underneath is the time. Your cortisol levels start very high in the morning, like I said, and decrease throughout the day. The blue line is the email day. The two days started very similar, but you can see the levels decrease more with the blue line, the email day, than with the email-free day, which stays pretty much higher than the blue line there. What we did notice is that people were less stressed, employees were less stressed, if they filed their emails away as they came into the inbox: the people that left them in the inbox were more stressed than the people who actually decided to file them away. We also found that email isn't that bad compared to other channels, like mobile phones, speaking on mobile phones, or face-to-face meetings. They all cause stress; it's just that the frequency of email and how we deal with it causes more stress.

So, talking to the RAs and the PhD students, I had my first attempt at doing an infographic, and they're much harder than I thought, as you can probably tell. Here are all the email facts, which I tried to put into one infographic; 17 years' worth of work in just one, or half, a page is quite sad really. The pie chart here on the right-hand side is some analysis of an inbox. You can see, in green, 15% of your inbox requires action. 39% of your inbox is just information only, at the top here, so it doesn't need to be delivered by email. 29% is unnecessary, so that's carbon copies, and we all hate those reply-to-alls; it happens a lot, I think, in every department, that you get the reply-to-all from someone who didn't realize how to use email. And then there's the 17% that are irrelevant. So if you think about the makeup of your inbox, really far too much information is going into your email account, and we can really reduce that.

We did a study at the Danwood Group many years ago looking at interrupt recovery time, and we discovered that people do seem to be addicted to email. The majority of staff react to email within six seconds of it arriving, which is incredible: you're kind of hovering over the icons at the bottom, waiting to see if an email comes in. Oh, that's much more interesting than work, let's go and see what the email says. You look like you're guilty there, I guess. Then, as soon as you go to email, you deal with it and get back into the work you were doing. It takes 64 seconds for you to get back into the work you were doing before the email interruption. That's far better than Tom DeMarco's finding for telephone interruptions, where they say it takes 10 to 15 minutes to get back into your work after a telephone interruption. So email is not so bad. The problem we have is that in the old systems, the default was to check for email every five minutes. So with new email coming in, in a working day you could potentially have 96 interruptions. If one comes in every five minutes and you process it, you've only got three and a half minutes before your next interruption. If every new email is a new task, and you can't close that task down, that starts to build up, and the brain can only deal with between 8 and 15 tasks at any one time; over that, you're starting to become overloaded. At the bottom here are the signs of overload: productivity is reduced; when you leave the office in the evening you feel quite fatigued; your creativity goes missing; and the quality of work decreases. And the sign of that is in the middle there: 87% of staff that we surveyed, over loads of different organizations, we won't mention them all, suffer from email overload, so it is a big issue.
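
The arithmetic behind those figures can be sketched as follows; the daily lost-time total is an extrapolation from the quoted numbers, not a result from the study:

```python
# Interruption arithmetic from the lecture: email checked every 5 minutes
# over an 8-hour day, with a measured 64-second recovery per interruption.
working_minutes = 8 * 60
check_interval_minutes = 5
recovery_seconds = 64

interruptions = working_minutes // check_interval_minutes   # 96 per day
recovery_minutes = interruptions * recovery_seconds / 60    # ~102 minutes

print(interruptions, f"{recovery_minutes:.1f} min lost to recovery")
```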

One in three, and this is the worrying one, are email dependent. I don't know what happens if you take away their BlackBerry or their computer; do they go into meltdown? I don't know, we didn't go that far. But it's interesting: 53% cannot deal with email anymore; they just find it too much and don't have time to do the work anymore. So it's a big problem, and we looked at trying to solve it. We did lots of seminar-based training, where we get lots of employees into the classroom and teach them how to do better email. That worked for a time, but after a month they'd go back into their old habits, which are the bad habits. So we then came up with a computer-based training system, and we worked with Ann and Martin on this, called REMS, the Ranking of Email Management Styles. This is a system that we started to roll out across the university, and it's also going out to easyJet and the WWF. You're probably thinking, what would wrestling want with an email system? No, it's the wildlife foundation that's using it. What it does is look at your email usage: how good you are at writing an email, how you manage your inbox, and finally, whether you are addicted to email. You send off your outbox to the system, and it then gives you a score and ranks you within the university to see how good you are. Bob, your secret's safe with me.

So what we want to know, moving it forward, is: can we determine how good we are at processing information? It's no good just doing one-size-fits-all, you should only have so many emails; could we actually break it down and look at the factors that contribute to information overload? We started a bit of work on this about two years ago, maybe a bit longer, and we came up with this: an information overload model. Ideally we'd like to turn it into a mathematical formula, but we're not quite there yet. On the left-hand side you've got not overloaded, on the right overloaded, and the tipping point in the middle. On the left-hand side you have a bit more control; on the right-hand side you have less control. So let's quickly whip through these. The first factor is the quality of information: how complex is the information you're trying to process, the ambiguity level, how novel and how uncertain it is. The next one is processing capacity: we all have brains that process information differently, so mathematical ability or literacy might differ, and the speed you can do it at changes from person to person. The other one is available time: if you had all the time in the world you'd probably complete most tasks, but within a business environment we don't have all the time in the world. Then quantity of information: there might not be much out there on the web, which could be a good thing, but then what's the quality like? If the quantity is low and the quality is poor, where are you going to get your information from to complete the task? It makes it much harder. Then you've got the task: how novel is the task, have you seen it before, are you multitasking, being interrupted all the time?

Finally, the personal factors: maybe you've had a row with your partner the night before and your mind's not quite on the task that you're trying to complete. So they're all the different factors that potentially make up information overload. With the students, I think it's always quite dry, so we tried to bring it to life, and what I want to try and do tonight, and this could fail miserably, but we'll give it a go anyway, is a simulation with helicopters. Here we have a helicopter pilot over a very densely populated city; tonight you're going to be the densely populated city when we fly the helicopters. This is what the cockpit looks like, and we're going to simulate what happens. We have two volunteers, who are pre-selected, so there's no pressure: we've got Becca and Paul. Haven't they disappeared? Where are they? Do you want to come on down? Have they disappeared? No, they are coming.

So our job is to try and keep these helicopters in the air, and on the computer we're going to simulate a helicopter as it hits problems, and we'll see at the end who can keep them in the air and who can't. Okay, guys, are we nearly ready to take off? Ready? Okay, so we'll start. Oh, we crashed already. I'm trying to get mine right; oh, we've got some crashes already. So, as we're going, the helicopter starts to hit some problems, and as the pilot I'm starting to think: how do I fix this problem? The buzzer is going, lots of things are going through my mind, and this doesn't feel particularly good. Should I get the manual out and start to look? Shall I look on Google for how to fix this particular problem? I've got pictures of my family going around in my head; I'm not really concentrating on the task. It seems to be going a bit funny. Am I taking you out, Becca? And then suddenly it all stops. We have light entertainment.

Great, thanks, guys, for that. So, the outcome: we talk about office environments and information overload, and you deal with it, don't you? But if you are a pilot, and I hope neither these two nor myself actually become pilots after what we've just done there, it can be fatal: you either land correctly or you don't. Going through what we tried to simulate here, you're looking at us trying to process this: how novel was the environment, how novel was it to fly the helicopter. We can start to model, and start to predict, whether you're likely to suffer from information overload.
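
The lecture notes that the model has not yet been turned into a mathematical formula. Purely as an illustration, the factors named above could be combined into a toy score like this; the function, weights, and 0-to-1 scales are hypothetical, not the authors' model:

```python
# Purely illustrative toy score over the six factors named in the lecture.
# All weights and scales are hypothetical; the real model is not formalised.
def overload_score(info_complexity, processing_capacity, available_time,
                   info_quantity, task_novelty, personal_load):
    """Each input in [0, 1]; a positive result suggests the tipping point
    toward overload has been passed, a negative one suggests it has not."""
    demand = (info_complexity + info_quantity + task_novelty + personal_load) / 4
    supply = (processing_capacity + available_time) / 2
    return demand - supply

# Example: complex, plentiful information, a novel task, and little time.
print(overload_score(0.8, 0.5, 0.2, 0.9, 0.7, 0.6))
```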

0:25:59.840,0:26:04.880
so we talked about using social media

0:26:05.120,0:26:09.200
and what it’s used for and we looked at
email as an example

0:26:09.200,0:26:14.720
of if it’s not used correctly how it can
actually start to affect you

0:26:14.720,0:26:20.159
now i want to look at what
is your data that you’re feeding into

0:26:20.159,0:26:25.360
the network used for
so if i said what if you could determine

0:26:25.360,0:26:29.440
the emotions of an audience
watching a tv program so you think

0:26:29.440,0:26:31.840
something like the
x factor if you’ve heard of it where

0:26:31.840,0:26:34.799
people sing it’s a real live
show

0:26:34.799,0:26:38.000
could you get real information back in
real time about

0:26:38.000,0:26:41.840
these acts and if they’re any good or
any bad if they’re not very good and a

0:26:41.840,0:26:44.000
few
factors are being affected could you

0:26:44.000,0:26:47.360
change them real time
to put something more entertaining on

0:26:47.360,0:26:49.600
what about if you knew about the
emotions

0:26:49.600,0:26:53.919
going into a commercial break so if you
knew the emotions of the audience

0:26:53.919,0:26:58.559
watching
could you develop or commission

0:26:58.559,0:27:02.240
an advert for specific emotions to tap
into them so they could help

0:27:02.240,0:27:07.039
sell a service or product you think
about is it i think red nose day is

0:27:07.039,0:27:10.720
coming up
march 17th something like that when

0:27:10.720,0:27:15.600
they’re showing those
video clips of the children in africa

0:27:15.600,0:27:20.640
how long should they keep those clips on
for if you start to know the emotions

0:27:20.640,0:27:26.159
and at what point people start to pick
up the phone and donate some money

0:27:26.159,0:27:29.919
well that all comes from and potentially
could be done

0:27:29.919,0:27:36.399
by using all your data out there
we had a project funded by

0:27:36.399,0:27:40.159
the ministry of defence and epsrc
part of the research council

0:27:40.159,0:27:43.200
and the project was called
emotive

0:27:43.200,0:27:49.440
and this is it and what it does in
simple terms we could use any social

0:27:49.440,0:27:52.720
media
tool but we chose twitter because it’s

0:27:52.720,0:27:57.200
very easy to get hold of
tweets so we want to take a tweet

0:27:57.200,0:28:02.240
and then see is there any emotion
related to that particular tweet and we

0:28:02.240,0:28:05.600
came up with uh
eight different emotions in the end and

0:28:05.600,0:28:08.880
they’re cross-cultural so we had fear
sadness which you can see at the very

0:28:08.880,0:28:13.440
bottom here anger
confusion disgust happiness shame and

0:28:13.440,0:28:16.640
surprise
now tonight we’re only going to show you

0:28:16.640,0:28:19.440
the really shiny bits we’re not going to
show you and we wouldn’t want to show

0:28:19.440,0:28:23.279
you how it all works
the integral parts of it but there is a

0:28:23.279,0:28:25.600
lot of natural
language processing so it’s taking

0:28:25.600,0:28:29.600
natural language and we’re trying to get
the computer to understand that

0:28:29.600,0:28:33.440
and on top of that we have what’s called
an ontology which is kind of like a map

0:28:33.440,0:28:36.720
of words
and it shows how words relate to other

0:28:36.720,0:28:41.200
words so for example
anger we have about 100 words in the

0:28:41.200,0:28:44.880
ontology
different words that represent anger we

0:28:44.880,0:28:47.279
also have intensifiers so you can be
very angry

0:28:47.279,0:28:51.440
very sad and that then gives you a score
for potentially

0:28:51.440,0:28:58.159
each tweet now that can be very powerful
and i’ll give you an example why after

0:28:58.159,0:29:01.440
this slide
so why bother looking at any social

0:29:01.440,0:29:05.039
media what does it actually tell us well
if you remember the egyptian uprising

0:29:05.039,0:29:08.320
activists said we use facebook to schedule
our protests

0:29:08.320,0:29:12.000
twitter to coordinate and youtube to
tell the world

0:29:12.000,0:29:15.600
social media also can break the news
first

0:29:15.600,0:29:21.520
so we had mumbai attacks in 2008
2009 jakarta bombings earthquakes in

0:29:21.520,0:29:26.000
japan
all reported first by our social media

0:29:26.000,0:29:29.200
so this comes back to you guys feeding
into

0:29:29.200,0:29:33.279
the network and it’s not just us
creating emotive there are other

0:29:33.279,0:29:37.279
commercial people out there
trying to do it so here i won’t mention

0:29:37.279,0:29:39.039
them all but they’re all here at the
bottom

0:29:39.039,0:29:42.320
currently our emotive system is the best
in the world

0:29:42.320,0:29:47.600
at determining fine-grained emotion so
no other system gets close to it
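the keyword-plus-intensifier scoring just described (ontology terms for each of the eight emotions, with intensifiers like "very" boosting the score) could be sketched roughly like this in python; the lexicon, weights, and function names here are invented for illustration and are not the actual emotive implementation:

```python
# Hypothetical sketch of keyword-plus-intensifier emotion scoring.
# The toy lexicon and weights are invented; the real EMOTIVE ontology
# holds around 100 terms per emotion across eight emotions.

EMOTIONS = ["fear", "sadness", "anger", "confusion",
            "disgust", "happiness", "shame", "surprise"]

# Toy lexicon: word -> (emotion, base score)
LEXICON = {
    "furious": ("anger", 2.0),
    "angry": ("anger", 1.0),
    "sad": ("sadness", 1.0),
    "happy": ("happiness", 1.0),
    "scared": ("fear", 1.0),
}

# Intensifiers multiply the score of the word that follows them.
INTENSIFIERS = {"very": 1.5, "extremely": 2.0, "slightly": 0.5}

def score_tweet(text):
    """Return a dict of emotion -> score for one tweet."""
    scores = {e: 0.0 for e in EMOTIONS}
    multiplier = 1.0
    for word in text.lower().split():
        if word in INTENSIFIERS:
            multiplier = INTENSIFIERS[word]
            continue
        if word in LEXICON:
            emotion, base = LEXICON[word]
            scores[emotion] += base * multiplier
        multiplier = 1.0
    return scores

print(score_tweet("i am very angry about this"))
```

a real system would sit a natural language processing layer in front of this (negation, sarcasm, spelling variants), which is exactly where the hard problems mentioned in the lecture live.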

0:29:47.919,0:29:53.200
the problem we do have with any tweets
is sarcasm

0:29:53.200,0:29:56.799
so i don’t know if you remember the mega
bus incident

0:29:56.799,0:30:00.480
on the m6 from memory uh i think it was
an

0:30:00.480,0:30:04.320
e-cigarette there were some passengers
on the megabus

0:30:04.320,0:30:08.080
uh something was vaporized it looked
like some sort of chemical warfare

0:30:08.080,0:30:11.120
a passenger then on the uh
megabus

0:30:11.120,0:30:14.880
phoned emergency services and all this
happened here

0:30:14.880,0:30:17.919
and i think it’s great these two
comments watching the news in tesco’s

0:30:17.919,0:30:20.799
bit confused about what’s happened on
the m6 toll road

0:30:20.799,0:30:25.520
this for me is sarcasm 2001 terrorists
hijacked planes

0:30:25.520,0:30:29.360
2012 terrorism alert on a megabus on the
m6 toll

0:30:29.360,0:30:33.200
this recession has hit hard

0:30:33.840,0:30:38.720
so for a computer to try and understand
that it’s difficult

0:30:40.640,0:30:45.200
let me take you back a few years to um
woolwich and i think you probably all

0:30:45.200,0:30:50.640
can remember the brutal
killing of uh soldier lee rigby

0:30:50.640,0:30:55.840
this system is quite important to be
developed for law enforcement agencies

0:30:55.840,0:30:58.880
because if you remember the london
riots can you start to predict

0:30:58.880,0:31:03.200
when a riot might happen so the emotive
system

0:31:03.200,0:31:06.880
was monitoring the lee rigby event so
you can see here the percentage of

0:31:06.880,0:31:09.919
emotions on the left hand side
the different colors on the right hand

0:31:09.919,0:31:14.159
side you can see there
and the timeline at the bottom so we

0:31:14.159,0:31:17.600
highlight
or move into two of the colors there

0:31:17.600,0:31:21.519
which
are sadness and disgust

0:31:21.519,0:31:27.519
so you can see they’re moving along here
and what might they lead

0:31:27.519,0:31:29.440
to who knows
you can actually drill in and look at

0:31:29.440,0:31:33.840
the tweets could it lead to a riot
nobody really knows until you gather

0:31:33.840,0:31:38.240
more data
but what’s interesting is you’ve got a

0:31:38.240,0:31:41.279
big
outburst of happiness why would you have

0:31:41.279,0:31:45.360
happiness in such a sad situation
and how did that change the dynamics

0:31:45.360,0:31:49.679
after that well the happiness
was to do with people being very proud

0:31:49.679,0:31:53.519
of woolwich
and also the wife of lee rigby coming

0:31:53.519,0:31:56.799
out and saying some very nice words
which changed the whole dynamic

0:31:56.799,0:32:02.399
of the event which maybe
i find a little bit scary because if

0:32:02.399,0:32:06.799
law enforcement agencies are
starting to use this

0:32:06.799,0:32:10.720
no longer do you have to send out troops
or police maybe it’s just a

0:32:10.720,0:32:16.240
war of words that you can start to
change feelings via social media
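the timeline view just described (the percentage of each emotion over time during an event) could be built by bucketing scored tweets into time bins. a minimal sketch, with invented data and an assumed (minute, emotion) input format:

```python
# Hypothetical sketch: turning individually scored tweets into an
# emotion timeline like the one described. Data is invented.
from collections import defaultdict

def emotion_timeline(scored_tweets, bin_minutes=60):
    """scored_tweets: list of (minute_offset, emotion) pairs.
    Returns {bin_start_minute: {emotion: percentage}}."""
    bins = defaultdict(lambda: defaultdict(int))
    for minute, emotion in scored_tweets:
        bins[(minute // bin_minutes) * bin_minutes][emotion] += 1
    timeline = {}
    for start, counts in sorted(bins.items()):
        total = sum(counts.values())
        timeline[start] = {e: 100.0 * n / total for e, n in counts.items()}
    return timeline

# Invented example: sadness and disgust early on, then an outburst
# of happiness in the second hour, as in the event described.
tweets = [(5, "sadness"), (10, "disgust"), (20, "sadness"),
          (70, "happiness"), (80, "happiness"), (90, "sadness")]
print(emotion_timeline(tweets))
```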

0:32:17.120,0:32:22.399
what about products so we looked some
time ago at uh the iphone s and iphone

0:32:22.399,0:32:25.919
c martin did some great job on the
graphics here

0:32:25.919,0:32:29.840
and the analysis so the 5s was the
more expensive one the 5c

0:32:29.840,0:32:32.880
was the cheap one you can see generally
uh

0:32:32.880,0:32:36.240
there’s a lot more happiness on the 5s
the more expensive one than the cheap

0:32:36.240,0:32:39.679
one
if you put into a bar chart

0:32:39.679,0:32:44.960
you um will see you know if apple were
doing this again

0:32:44.960,0:32:49.360
the cheap one uh people aren’t so happy
with a lot of anger confusion

0:32:49.360,0:32:53.519
in terms of happiness not so high a lot
of sadness

0:32:53.519,0:32:58.000
so if you’re trying to develop a product
maybe you would start thinking about

0:32:58.000,0:33:00.799
real time
monitoring a product or service to get

0:33:00.799,0:33:02.960
real-time feedback which then could
change

0:33:02.960,0:33:06.320
you could go out there and be proactive
and try

0:33:06.320,0:33:10.080
and make the customers that bought a
very cheap one that hasn’t got the

0:33:10.080,0:33:13.679
functionality
better somehow by maybe developing a new

0:33:13.679,0:33:17.120
service or an app they could release

0:33:18.000,0:33:24.480
we are using the emotive engine for
something with a working title called vote b

0:33:24.480,0:33:30.880
so running up to the general election
we’re gonna launch a free and a fee app

0:33:30.880,0:33:34.720
and it will uh track the emotions
of the public towards

0:33:34.720,0:33:40.159
politicians uh parties and policies
so be interesting to see what people

0:33:40.159,0:33:44.240
think of farage and see if there’s
sadness happiness we don’t know but it’d

0:33:44.240,0:33:46.559
be quite interesting to see so that
would be launched

0:33:46.559,0:33:53.039
soon this is the bit i suppose
we’ve done email for 17 years and

0:33:53.039,0:33:56.799
throughout that as a team
we’ve done lots of things and

0:33:56.799,0:34:00.159
the thing that
interests me and really gets me going in

0:34:00.159,0:34:03.279
the morning at the moment
is the embedded intelligence side but it

0:34:03.279,0:34:08.240
also has a bit of a scary element
to it i’ve been working with andy and

0:34:08.240,0:34:15.440
paul and will and chris and lisa
on this and the guys have developed

0:34:15.440,0:34:20.879
a system a board which has sensors on so
it could have a temperature sensor it

0:34:20.879,0:34:24.480
could have a vibration sensor you could
pretty much have any sensor you want

0:34:24.480,0:34:27.839
and then it’s linked to an rfid tag that
then gives that information

0:34:27.839,0:34:31.359
out to the outside world which is the
interesting thing

0:34:31.359,0:34:36.800
so you could put this
the tags the embedded intelligence tags

0:34:36.800,0:34:40.960
into pretty much
any component in any car

0:34:40.960,0:34:44.480
so if you took your brakes sorry if you
took your car

0:34:44.480,0:34:49.359
in to have its brake calipers changed
and they change them in the garage you

0:34:49.359,0:34:52.800
could go and get back into your car
turn on your ignition and say these are

0:34:52.800,0:34:57.839
not the recommended brake calipers
from this manufacturer if you continue

0:34:57.839,0:35:01.200
you’re going to invalidate your warranty
so you could go back and look hang on

0:35:01.200,0:35:03.680
guys in the garage this is wrong i asked
for the

0:35:03.680,0:35:07.040
you know the premium product you’ve not
given it to me or

0:35:07.040,0:35:10.640
you know that’s the case and you think
fine invalidate my warranty i don’t

0:35:10.640,0:35:14.079
care
but what it does give you also from the

0:35:14.079,0:35:18.640
manufacturer’s point of view
you can see how that new component works

0:35:18.640,0:35:22.320
with the other components in the car
so you’re getting real-time information

0:35:22.320,0:35:26.720
back the impact of that component on
other components
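the warranty check described above could work roughly like this: the car reads the fitted component's embedded tag and compares it against the manufacturer's approved list. a hypothetical sketch only; the part numbers, tag fields, and function names are invented:

```python
# Hypothetical sketch: a car's software reading an embedded RFID tag on
# a newly fitted component and checking it against an approved-parts
# list. Part numbers and the tag data format are invented.

APPROVED_CALIPERS = {"BC-1001", "BC-1002"}  # manufacturer-recommended

def check_component(tag_data):
    """tag_data: dict of fields read from the component's RFID tag."""
    part = tag_data.get("part_number")
    if part in APPROVED_CALIPERS:
        return "component ok"
    return ("warning: %s is not a recommended brake caliper; "
            "continued use may invalidate your warranty" % part)

# The same tag can also carry live sensor readings (temperature,
# vibration) back to the manufacturer alongside the identity check.
print(check_component({"part_number": "BC-9999", "temperature_c": 21.5}))
```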

0:35:26.720,0:35:31.599
what it also does in the future uh you
don’t have to go on gocompare.com and

0:35:31.599,0:35:36.240
get all the emails from them anymore
because we will know i might say we know

0:35:36.240,0:35:40.320
it doesn’t have to be in cars
how are you driving your car so it gets

0:35:40.320,0:35:44.480
how fast you’ve been driving
how violent how the vibration everything

0:35:44.480,0:35:49.359
so you can start to work out your car
premiums when it comes to recalling cars

0:35:49.359,0:35:53.440
you remember toyota
and they did a big recall maybe you

0:35:53.440,0:35:56.640
don’t have to recall all the cars
you just need to recall the cars that

0:35:56.640,0:36:01.920
you know are at danger
we’ve had a phd student who’s nearly

0:36:01.920,0:36:05.520
finished called mike hurst and he’s been
working myself from a shooter

0:36:05.520,0:36:08.880
and he’s been looking at emotions so
looking at the face and trying to

0:36:08.880,0:36:12.320
determine emotions
and looking at pupil size and also

0:36:12.320,0:36:16.320
determining emotions
so we’re also working with bmw on this

0:36:16.320,0:36:20.000
so if you could put that into a car
potentially you could start to work out

0:36:20.000,0:36:24.400
the emotion of the driver in that car
which could lead to all sorts of things

0:36:24.400,0:36:28.480
so think about driving along the road
it knows you’re in a happy mood you come

0:36:28.480,0:36:32.079
along it’s an advert board there and it
shows you a nice advert that taps into

0:36:32.079,0:36:35.599
your happiness
you could also link it to your gps so

0:36:35.599,0:36:38.960
you can find out which are the happy roads
in the uk or the world or the sad

0:36:38.960,0:36:42.560
roads
but it’s quite worrying that you can

0:36:42.560,0:36:46.400
potentially share all this information
with each other i think and it’s that

0:36:46.400,0:36:49.680
sort of question is that a good thing or
is that a bad thing

0:36:49.680,0:36:53.440
but to make that happen it’s okay giving
us all the information

0:36:53.440,0:36:56.720
you need to start to bring it all
together and this is the exciting

0:36:56.720,0:37:00.960
bit this is what’s going to change a lot
of things out there so we heard about

0:37:00.960,0:37:04.800
the semantic web and maybe the internet
of things before

0:37:04.800,0:37:08.000
and we heard about ontologies but i
think ontologies really have not

0:37:08.000,0:37:12.320
taken off necessarily because they lack
so much information that needs to be in

0:37:12.320,0:37:14.160
there
so an ontology looks pretty much like

0:37:14.160,0:37:19.119
this you have a car
okay and a car has three wheels or has

0:37:19.119,0:37:21.920
four wheels
well which one is it well you have to

0:37:21.920,0:37:24.640
build another ontology to show that
it’s either three-wheel

0:37:24.640,0:37:29.280
or four-wheel there’s no probability in an
ontology

0:37:29.280,0:37:33.680
but you get the idea it links things
together which is very nice

0:37:33.680,0:37:36.720
but what we want to try and do and we’ve
got

0:37:36.720,0:37:41.119
pretty much along the way to getting
there is looking at the probabilities

0:37:41.119,0:37:45.359
having different nodes so you can
construct anything in an ontology

0:37:45.359,0:37:49.520
and have the probabilities you can
control anything via the ontology

0:37:49.520,0:37:54.000
you can do fault-finding simulation
prediction and potentially self-healing

0:37:54.000,0:37:59.119
so it’s bringing all that data together
and having the ontology control

0:37:59.119,0:38:02.960
everything
the hard one and that’s why it started

0:38:02.960,0:38:06.400
is the learning ontology
so whilst the ontology is out there in

0:38:06.400,0:38:09.520
the real environment
how can you actually relate or how do

0:38:09.520,0:38:12.320
you actually add new nodes to an
ontology

0:38:12.320,0:38:15.920
without a human doing that and that’s
that’s very difficult

0:38:15.920,0:38:20.079
and yeah we’re nowhere near that at the
moment real-time data so you’ve got

0:38:20.079,0:38:23.760
real-time data going into this ontology
so a good example would be

0:38:23.760,0:38:28.960
um a car crash and you could work out
from the embedded sensors

0:38:28.960,0:38:31.920
what went wrong with the car potentially
there’s lots we could have done tonight

0:38:31.920,0:38:35.440
but i thought this is probably
the nice one well not nice it’s a car crash but

0:38:35.440,0:38:39.440
it’s just a nice simple one to look at
so in terms of the car crash why did it

0:38:39.440,0:38:41.520
fail so you could look at the left hand
side

0:38:41.520,0:38:45.200
is it three wheels is it four wheels did
we have a blowout in one of the tyres we

0:38:45.200,0:38:47.119
could probably get that from one of the
sensors

0:38:47.119,0:38:50.480
so probability was four wheels but it
probably wasn’t

0:38:50.480,0:38:56.720
a blowout if you look uh here so
was it to do with the brakes well probably

0:38:56.720,0:38:59.359
not looking at the sensors the
information we got back from them

0:38:59.359,0:39:01.920
they’re pretty normal
was it to do with the fuel tank some

0:39:01.920,0:39:06.000
sort of surge in the fuel tank
uh no probably wasn’t to do with that

0:39:06.000,0:39:08.960
hang on let’s have a look then so is it
to do with the driver from looking at

0:39:08.960,0:39:12.800
the sensors within the car
yes we know it’s not a computer we know

0:39:12.800,0:39:16.800
it’s a human
driver again from the sensors so then

0:39:16.800,0:39:19.680
what caused the crash
well then we can start looking at maybe

0:39:19.680,0:39:23.359
a bit of fuzzy logic in terms of well
was it the processing

0:39:23.359,0:39:26.400
capacity of the individual driving that
car

0:39:26.400,0:39:32.079
so let’s stereotype
um well i shouldn’t do that but we will

0:39:32.079,0:39:37.280
so if we say it’s an old person
driving a car i think i can get away

0:39:37.280,0:39:40.160
with it
um processing capacity might be good but

0:39:40.160,0:39:42.480
if you’ve got the grandchildren in the
car

0:39:42.480,0:39:45.760
you’ve got your sat nav talking to you
got the radio on

0:39:45.760,0:39:49.680
the chances are it’s much harder to
process the information especially when

0:39:49.680,0:39:54.160
maybe it’s dark outside and it’s snowing
so you still need a bit of fuzzy logic

0:39:54.160,0:39:57.599
but you can then start to determine from
this from the embedded sensors

0:39:57.599,0:40:02.480
using the ontology what potentially
caused a crash
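the fault-finding walk-through above (ruling out a blowout, the brakes, and the fuel tank until driver error remains) can be sketched as a naive bayesian update over candidate causes, with each sensor reading adjusting the beliefs. all priors and likelihood numbers below are invented for illustration; this is not the actual ontology engine:

```python
# Hypothetical sketch of probabilistic fault-finding over an ontology:
# start from prior beliefs about candidate causes and update them as
# each embedded sensor reports in. All numbers are invented.

def diagnose(priors, likelihoods, evidence):
    """Naive Bayesian update.
    priors: {cause: prior probability}
    likelihoods: {cause: {sensor_reading: P(reading | cause)}}
    evidence: list of observed sensor readings."""
    posterior = dict(priors)
    for reading in evidence:
        for cause in posterior:
            # Unlisted readings get a small default likelihood.
            posterior[cause] *= likelihoods[cause].get(reading, 0.01)
    total = sum(posterior.values())
    return {c: p / total for c, p in posterior.items()}

priors = {"tyre blowout": 0.25, "brake failure": 0.25,
          "fuel fault": 0.25, "driver error": 0.25}
likelihoods = {
    "tyre blowout": {"tyre_pressure_normal": 0.05,
                     "brakes_normal": 0.9, "fuel_flow_normal": 0.9},
    "brake failure": {"tyre_pressure_normal": 0.9,
                      "brakes_normal": 0.05, "fuel_flow_normal": 0.9},
    "fuel fault": {"tyre_pressure_normal": 0.9,
                   "brakes_normal": 0.9, "fuel_flow_normal": 0.05},
    "driver error": {"tyre_pressure_normal": 0.9,
                     "brakes_normal": 0.9, "fuel_flow_normal": 0.9},
}
# All three subsystems report normal, so belief shifts to the driver.
evidence = ["tyre_pressure_normal", "brakes_normal", "fuel_flow_normal"]
result = diagnose(priors, likelihoods, evidence)
print(max(result, key=result.get))
```

the fuzzy-logic step the lecture mentions (driver age, passengers, sat nav, darkness, snow) would then refine the driver-error hypothesis rather than the hard sensor evidence.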

0:40:02.480,0:40:07.040
so that is pretty much the way forward i
see the next two to three years

0:40:07.040,0:40:12.560
but it’s a lot of work to try and
achieve that

0:40:13.440,0:40:17.520
so trying to bring things pretty much to
a close

0:40:17.520,0:40:21.359
i don’t know if you find it scary or
not all this embedded intelligence and

0:40:21.359,0:40:23.599
you’re one of the sensors feeding into
the grid

0:40:23.599,0:40:26.720
but i did a lecture this morning to the
first years and they all thought it’s

0:40:26.720,0:40:30.720
absolutely fine no problem you can take
all my data you can use it i have no

0:40:30.720,0:40:34.480
problems with that
but talking to other people i think the

0:40:34.480,0:40:38.400
older generation my generation
yes i think we are a little bit scared

0:40:38.400,0:40:41.200
about that
but you can see potential benefits if

0:40:41.200,0:40:45.599
you think about the tesco club card
and if cancer research got together that

0:40:45.599,0:40:48.960
could be a powerful combination
you know exactly what a person’s

0:40:48.960,0:40:53.200
purchase so you know the food the fat
intake the salt intake the sugar intake

0:40:53.200,0:40:56.800
maybe the chemicals they’ve bought in
terms of detergents and then you can

0:40:56.800,0:40:59.280
start to see if you link that cancer
research data

0:40:59.280,0:41:05.440
can you see what might lead to
cancer so there’s a benefit there but

0:41:05.440,0:41:08.640
then potentially you’re starting to
predict life expectancy

0:41:08.640,0:41:13.200
so uh premiums on life insurance are no
longer a gamble you know exactly what

0:41:13.200,0:41:15.920
you’re doing
pensions you know exactly where you sit

0:41:15.920,0:41:18.400
with the pot of money because you know
how long people

0:41:18.400,0:41:21.440
will live so the question for me is how
far

0:41:21.440,0:41:24.319
do we want to go

0:41:24.640,0:41:28.720
it’s not all bad news social media
command centers now

0:41:28.720,0:41:34.480
pop up this is where companies many
hotels actually buy into these they

0:41:34.480,0:41:38.240
monitor everything you say on social
media so if you have something bad to

0:41:38.240,0:41:41.920
say about
a hotel chain they’re onto it saying

0:41:41.920,0:41:45.839
sorry sir sorry to hear about this what
can we do to help

0:41:45.839,0:41:49.839
so it’s creating jobs it’s good for the
economy

0:41:50.240,0:41:54.560
so to conclude really remember we are
just

0:41:54.560,0:41:58.240
a sensor in this connected jungle and
we’re feeding into this

0:41:58.240,0:42:02.400
connected jungle and i don’t know if
that’s good or bad

0:42:02.400,0:42:05.760
but one thing we’ve got to do is make
sure we have our own coping

0:42:05.760,0:42:08.960
strategies there’s a wealth of
information out there

0:42:08.960,0:42:12.400
on a daily basis how do we cope with all
that information

0:42:12.400,0:42:15.520
how do you cope so when you go home
tonight and you sit and go

0:42:15.520,0:42:18.880
to work tomorrow are you going to change
the way you deal with

0:42:18.880,0:42:22.400
email are you going to start to think
about actually most evenings i go home

0:42:22.400,0:42:25.760
i’m quite fatigued
can i start to change that can you

0:42:25.760,0:42:31.599
increase the quality of your life
i think it is an exceedingly exciting

0:42:31.599,0:42:34.880
area to be working in and it’s rapidly
growing

0:42:34.880,0:42:39.200
but we’ve got to shape it and what i
mean by that is how far do we go with

0:42:39.200,0:42:41.680
all this
how far do we go by bringing everything

0:42:41.680,0:42:44.560
together and having one system that can
control

0:42:44.560,0:42:47.920
potentially everything now we’re many
years off that but we can

0:42:47.920,0:42:54.640
eventually get there so to finish on
just to prove uh a point really

0:42:54.640,0:43:01.359
is a that’s not you yet angus hang on
there’s a video i’d just like to show

0:43:01.359,0:43:04.160
you which i don’t know if you’ve seen
before

0:43:04.160,0:43:12.240
i think the quality might not be
brilliant let’s have a look

0:43:12.240,0:43:15.839
the computer’s just thinking about it

0:43:18.240,0:43:20.880
still thinking

0:43:24.800,0:43:27.839
ah it’s a shame

0:43:30.000,0:43:36.079
quality let’s just check it’s not in the
wrong way i think it’s not

0:43:36.079,0:43:41.520
well what you would have seen is
uh which is probably a good thing really

0:43:41.520,0:43:45.119
is um i’ll just check it isn’t there

0:43:45.440,0:43:48.640
it’s worked before but it’s not working
tonight on this camera

0:43:48.640,0:43:52.960
on this uh helicopter there’s an actual camera
and what i want to show you is that we

0:43:52.960,0:43:57.119
have had all your faces
tonight on here on the video uh which

0:43:57.119,0:44:00.640
sometimes again it comes down to ethics
do you know you’ve actually been recorded

0:44:00.640,0:44:02.480
and what’s your data going to be used
for

0:44:02.480,0:44:06.839
so unfortunately you’re all safe and
your faces are not on the video

0:44:06.839,0:44:11.359
tonight it’s worked every other day but
not today there we go

0:44:11.359,0:44:27.839
anyway thank you uh for coming tonight
and i hope you enjoyed the talk
