Alexa surveillance

Please Register as a New User in order to reply to this topic.
tripehound 10:20 Thu

This was in the Guardian yesterday. Normally you would think this was paranoia, but on reading it, it's quite disturbing. Alexa is off from now on in our household.

Amazon is collecting our data (even our conversations at home) and they cannot be trusted with it. It could be used by corrupt administrations (such as the current one in the USA). It's truly Orwellian.

https://www.theguardian.com/technology/2019/oct/09/alexa-are-you-invading-my-privacy-the-dark-side-of-our-voice-assistants

summo 10:24 Thu
In reply to tripehound:

Wouldn't worry. Your mobile phone sat on the table near you now could easily be recording everything anyway. 

Post edited at 10:24
tripehound 10:27 Thu
In reply to summo:

> Wouldn't worry. Your mobile phone sat on the table near you now could easily be recording everything anyway. 

If you read it, Siri isn't as high a risk.

summo 10:32 Thu
In reply to tripehound:

But can you 100% guarantee that an app isn't tracking your location, searches, conversations etc.? It's not like just putting tape over a laptop's webcam. Even if you disable functions in an app's settings, you are trusting them.

ScottTalbot 10:36 Thu
In reply to tripehound:

> If you read it, Siri isn't as high a risk.

My friends and I have noticed (more with their iPhones than my Samsung) that our conversations have been recorded and adverts on Facebook etc. have been targeted at what we have been discussing.

Try it with your iPhones, people. Talk about wanting something that you have never searched for, so it's not in your history, and watch the adverts pop up!

Post edited at 10:37
mullermn 11:23 Thu
In reply to ScottTalbot:

This is pure confirmation bias. There’s no evidence that this happens beyond people spotting these coincidences and jumping to conclusions. 

There’s not even a logical technical basis for this to work. All of these voice assistants currently operate on a similar pattern: they have a wake word (‘Hey Siri’, ‘Alexa’ etc.) which the device itself listens for. It then captures audio and ships it off to the mothership, where the sound is interpreted and the speech analysed. The devices themselves do not have the grunt to do it themselves. (You can test this: turn your phone into airplane mode and try and use Siri.)
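Roughly, in toy Python (detect_wake_word and send_to_cloud here are stand-ins I've made up; the real on-device detector is a small trained model, not this sketch):

```python
import collections

SAMPLE_RATE = 16_000           # 16 kHz mono is typical for speech
FRAME = SAMPLE_RATE // 10      # 100 ms of samples per frame

def is_silence(frame, threshold=500):
    """Crude energy-based end-of-speech check."""
    return max(abs(s) for s in frame) < threshold

def assistant_loop(mic, detect_wake_word, send_to_cloud):
    """Listen locally; ship audio to the cloud only after the wake word."""
    ring = collections.deque(maxlen=10)      # ~1 s rolling buffer
    for frame in mic:                        # endless stream of audio frames
        ring.append(frame)
        if detect_wake_word(ring):           # cheap on-device pattern match
            utterance = list(ring)
            for frame in mic:                # now record until the speaker stops
                utterance.append(frame)
                if is_silence(frame):
                    break
            return send_to_cloud(utterance)  # heavy lifting happens remotely
```

The point being that the expensive general speech recognition only ever happens server-side, after the trigger; everything before it lives in a one-second buffer on the device and is thrown away.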

In order for what you describe to be happening, these devices would have to be capturing audio and streaming it somewhere the whole time for analysis. Your battery would be flat and the mobile networks would be overloaded under the burden of every device making the equivalent of a permanent Skype call 24/7. It simply doesn’t pass the sniff test.
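The back-of-envelope arithmetic backs this up. Assuming 16 kHz / 16-bit mono audio and, generously, a modern voice codec at 24 kbit/s (both figures are my assumptions, nothing from the article):

```python
# Raw speech-quality audio: 16,000 samples/s * 2 bytes = 32 kB/s
raw_bytes_per_day = 16_000 * 2 * 60 * 60 * 24
print(f"raw: {raw_bytes_per_day / 1e9:.1f} GB/day")           # ~2.8 GB/day

# Even compressed with a decent voice codec at ~24 kbit/s:
codec_bytes_per_day = 24_000 / 8 * 60 * 60 * 24
print(f"compressed: {codec_bytes_per_day / 1e6:.0f} MB/day")  # ~259 MB/day
```

Per device, per day. Multiply by hundreds of millions of handsets and that upload would be impossible to hide from network operators, battery statistics, or anyone running a packet capture on their home router.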

And that’s not to mention the fact that no security researchers have got themselves in the headlines by exposing evidence of it. 

MarkJH 12:09 Thu
In reply to mullermn:

> There’s not even a logical technical basis for this to work. All of these voice assistants currently operate on a similar pattern: they have a wake word (‘Hey Siri’, ‘Alexa’ etc.) which the device itself listens for. It then captures audio and ships it off to the mothership, where the sound is interpreted and the speech analysed. The devices themselves do not have the grunt to do it themselves. (You can test this: turn your phone into airplane mode and try and use Siri.)

But that explanation (in itself) suggests that the phones must have the ability to recognise the wake words.  Presumably if you download a third-party app (and grant it permission to access your microphone), then there wouldn't be any technological barrier to it using different wake words, e.g. "I was looking for" or "I want", and capturing audio around those.

I don't use the Google Assistant, but I have left all of the permissions on.  Out of interest, I checked what recordings of me Google had stored (available in your account settings); they all had variations on that theme ("I want", "I was looking").  Possibly (probably?) a coincidence, but it is slightly disconcerting to hear audio of yourself online that you weren't aware was being captured!

Lusk 12:32 Thu
In reply to mullermn:

Does that mean I can't repeatedly say "I love dildos" into my mate's iPhone when he isn't listening, and he won't get bombarded with dildo ads?

Disappointed of Manchester.

mullermn 12:33 Thu
In reply to MarkJH:

I don’t believe this is necessarily correct. The problem of recognising one distinct pattern can be heavily optimised in a way that doesn’t extend to the general problem.

See, for example, the various voice recognition systems that have been present in cars and other devices for years. There’s a reason they don’t allow you to configure your own phrases. 

mullermn 12:34 Thu
In reply to Lusk:

I don’t think there’s any evidence it would work, sorry. If you can get access to his browser and search around for dildos for a bit that should work, however, as well as possibly broadening your horizons. 

MarkJH 12:41 Thu
In reply to mullermn:

> I don’t believe this is necessarily correct. The problem of recognising one distinct pattern can be heavily optimised in a way that doesn’t extend to the general problem.

> See, for example, the various voice recognition systems that have been present in cars and other devices for years. There’s a reason they don’t allow you to configure your own phrases. 

That is true, but those are all cases where the function is to interact directly with the user, and a high error rate would be a usability problem.  If your only purpose is to keep the amount of data used to a set (or maximum) level, then any voice recognition that is an improvement on random sampling would have some value.

mullermn 12:48 Thu
In reply to MarkJH:

OK, I’ll agree that is true, though it’d still probably show up in the form of knackering your battery life. But bear in mind what you’re saying: that Facebook etc. are so interested in patchy, low-quality data that might help them sell a few more adverts that they’ve instituted a global surveillance network which would immediately unleash a mountain of legal trouble their way if exposed. And nobody has managed to demonstrate it happening, despite the level of popular interest in the topic.

It’s tinfoil hat territory. It’s like me saying that there are microphones in the lightbulbs because I talked about needing a shed and then a gardening catalogue dropped through the letter box. 

Post edited at 12:50
MarkJH 13:01 Thu
In reply to mullermn:

> It’s tinfoil hat territory. It’s like me saying that there are microphones in the lightbulbs because I talked about needing a shed and then a gardening catalogue dropped through the letter box. 

OK, that is fair enough.  More a comment on the technical aspects, and I agree that the big companies probably aren't doing it.  However, with third-party apps I'd be less sure.  There was a story a year or two back about software embedded in some free-to-play games that used the microphone to identify what TV programmes people were watching, in order to sell the data to advertisers.  It didn't transmit raw audio, but did some sort of data reduction on the phone and sent a much smaller file back to the servers for matching.  Not quite the same thing, but it does show that many companies are not entirely upfront about why they want the particular permissions that they request.
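That kind of on-phone data reduction is essentially acoustic fingerprinting, the Shazam-style trick: collapse the audio to a handful of spectral peaks and upload only those. A toy version of the idea (the exact scheme here is invented for illustration; real systems hash pairs of peaks and are far more robust):

```python
import numpy as np

def fingerprint(audio, window=1024):
    """Collapse raw audio to a compact list of (window, peak-frequency-bin) pairs."""
    hashes = []
    for start in range(0, len(audio) - window, window):
        spectrum = np.abs(np.fft.rfft(audio[start:start + window]))
        peak_bin = int(np.argmax(spectrum[1:])) + 1   # skip the DC component
        hashes.append((start // window, peak_bin))
    return hashes

# One second of a 440 Hz tone sampled at 8 kHz: 8,000 samples become 7 tiny tuples.
t = np.arange(8_000) / 8_000
fp = fingerprint(np.sin(2 * np.pi * 440 * t))
```

A few bytes of hashes per second is cheap to upload and easy to match server-side against a library of known TV audio, which is why this sort of thing is feasible where continuous raw-audio streaming isn't.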

Post edited at 13:01
LastBoyScout 13:25 Thu
In reply to MarkJH:

I don't and wouldn't have Alexa, or any other version, in my house, nor will I be buying any form of smart TV.

I also turn off/deny any app permissions that don't seem in keeping with the app that is asking for them.

I'm a software developer, and we do a certain amount of data logging in our software, using anonymised data, to check performance and usage statistics for various reasons. There's nothing physically stopping me expanding that harvesting to capture other stuff the user might be up to.
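For context, the benign version of that logging usually looks something like this (field names are made up; the point is that nothing but developer restraint limits what goes into the detail field):

```python
import hashlib
import json
import time

def log_event(user_id, event, detail):
    """Anonymised usage logging: hash the user ID, record the event."""
    record = {
        "user": hashlib.sha256(user_id.encode()).hexdigest()[:16],  # no raw ID
        "event": event,
        "detail": detail,   # nothing technical stops this holding anything at all
        "ts": int(time.time()),
    }
    return json.dumps(record)   # in production this gets POSTed to a server
```

The privacy guarantee lives entirely in what the developer chooses to put in that dict, not in anything the platform enforces.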

Snyggapa 13:35 Thu
In reply to LastBoyScout:

We thought we had disabled Alexa/Google/whatever and were joking about it one day with the little one. Stupidly said out loud, "Alexa, what noise does a dog make?" A voice from under the sofa started to tell us. A proper WTF moment - then we realised that we have an old Amazon Fire tablet, which the little one watches films etc. on during long road trips, and that didn't have Alexa disabled on it.

I can confirm that the "Alexa" recognition isn't much good for four-year-old kids, as they shout too excitedly and quickly at it, so it doesn't hear the trigger word very well - but try telling the four-year-old that you're going to turn off the thing that she finds amusing.

I must check if we eventually disabled it...

DubyaJamesDubya
In reply to mullermn:

Amazon have admitted to eavesdropping on some customers for 'test purposes'.

If this is possible in some cases it is open to abuse.

mullermn 13:44 Thu
In reply to DubyaJamesDubya:

> Amazon have admitted to eavesdropping on some customers for 'test purposes'.

> If this is possible in some cases it is open to abuse.

Citation needed on that one, I think. Where Alexa picks up some speech that it can’t understand, they have been having human reviewers listen to it to train the system (I believe this is now configurable?), but that’s a long way from ‘eavesdropping’, which implies something malicious.

Happy to concede I’m wrong if you can find a statement from Amazon ‘admitting’ it though.  

mullermn 13:46 Thu
In reply to Snyggapa:

> I must check if we eventually disabled it...

This is another thing that baffles me about this. People think that all these apps etc are spying on them illicitly in defiance of all morality, but provided you tick the little box to turn the microphone off then suddenly all is well?

If you’ve made the leap that you think they’re spying on you, why would you trust them to stop when you ask them nicely?

mullermn 13:48 Thu
In reply to DubyaJamesDubya:

Did you actually read that article? It supports exactly what I said.

Timmd 13:50 Thu
In reply to mullermn:

> It’s tinfoil hat territory. It’s like me saying that there are microphones in the lightbulbs because I talked about needing a shed and then a gardening catalogue dropped through the letter box. 

I'm thinking that anything to do with speech recognition, search engines and social media is probably worth being cautious about, even if there isn't a concerted plot to eavesdrop, just because of the fallibility of humans and their computer programming when it comes to safeguarding our privacy.

Post edited at 13:50
DubyaJamesDubya
In reply to mullermn:

I see that Amazon have been able to take recordings from Alexa devices.

We are supposed to trust that their safeguards are foolproof and that their version of what they do with it is to be trusted and not open to abuse by any individual working for Amazon.

mullermn 13:57 Thu
In reply to Timmd:

I would agree. The number one reason that this (apps spying on you the whole time) isn’t possible is that the local devices can’t do it unassisted and it’s not logistically possible to centralise it.

It won’t be long before the problem is well enough understood and the devices are powerful enough to do all the natural language interpretation locally - one of the companies (Amazon/Google/Apple... can’t remember) has already been talking about this, and the new Amazon Fire Cube can control your media devices without requiring the cloud Alexa service.

At that point the amount of data required to report your speech back home becomes trivial, and collecting it en masse becomes perfectly feasible. The best defence against this is a society that’s wise to the problem, so that the right protections can be baked into the law and people’s expectations in time. The GDPR is a surprisingly forward-thinking piece of legislation in this respect.

mullermn 14:01 Thu
In reply to DubyaJamesDubya:

> I see that Amazon have been able to take recordings from Alexa devices.

Ah. This might be your misunderstanding. It’s not ‘been able to’, it’s inherent in the way it works. Everything that Alexa captures is stored on the internet and you can listen to it back via the app or the website.

The scenario you’re talking about is when Alexa captures some audio and can’t understand it (or you tell it the interpretation was wrong), in which case it may pass the audio to a human reviewer to try and work out what you were on about.

It’s nothing illicit, it’s just how it works. It’s like hiring a cleaner and then being outraged that there’s a stranger in your house.

Edit: it’s worth noting that while the IT systems take the criticism, the invasions of privacy in the examples shown in the Forbes article are all being performed by good old-fashioned meat-bag humans. If you hire that cleaner and leave your collection of fetishwear on view, she’s going to be laughing about it in the pub with her mates later. No IT required.

Post edited at 14:09
ColdWill 14:01 Thu
In reply to tripehound:

1st world problems....

Siward 14:24 Thu
In reply to ColdWill:

Indeed. The world is rapidly becoming interconnected to a degree barely imaginable a decade or so ago. Won't be long until your consciousness is uploaded. Getting all possessive about your actually minimally valuable data is rather pointless. Either you're online, in the fullest sense, or you live off grid in the woods (even then, don't assume the robotic insects aren't listening to you...) 

DubyaJamesDubya
In reply to mullermn:

> Ah. This might be your misunderstanding. It’s not ‘been able to’, it’s inherent in the way it works. Everything that Alexa captures is stored on the internet and you can listen to it back via the app or the website.

> The scenario you’re talking about is when Alexa captures some audio and can’t understand it (or you tell it the interpretation was wrong), in which case it may pass the audio to a human reviewer to try and work out what you were on about.

> It’s nothing illicit, it’s just how it works. It’s like hiring a cleaner and then being outraged that there’s a stranger in your house.

> Edit: it’s worth noting that while the IT systems take the criticism, the invasions of privacy in the examples shown in the Forbes article are all being performed by good old-fashioned meat-bag humans. If you hire that cleaner and leave your collection of fetishwear on view, she’s going to be laughing about it in the pub with her mates later. No IT required.

Fair enough, to a point, but if I'd hired a cleaner I would know not to leave out, say, gold bullion. The point is that the meat-bag humans got to listen to stuff that people would have imagined was impossible. It seems to me that if someone unscrupulous were to set the Alexa device to a mode in which it decided everything it listened to needed assessing, how would you know?

Ultimately, you've got a microphone in your house connected to the internet...

mullermn 15:09 Thu
In reply to DubyaJamesDubya:

> It seems to me that if someone unscrupulous were to set the Alexa device to a mode in which it decided everything it listened to needed assessing, how would you know?

> Ultimately, you've got a microphone in your house connected to the internet...

Well, there’s two branches to the answer. If you assume that everyone’s ‘playing fair’ then you’d know (with Alexa) because it lights up when it’s listening, and you can also set it to chime when it starts/ends listening if you want. 

If you assume that the device is ‘playing dirty’ then all bets are off. As you say, it’s a device with a microphone connected to the internet. At that point you’re reliant on a 3rd party catching it red handed and publicising it (it’s worth noting that there is a large community of people who would love to be able to do that).

The problem with the ‘playing dirty’ view is: where do you stop? You (or at least I) carry a phone around all day, you spend all day surrounded by laptops with microphones, speakerphones, hands-free kits, cars with speakerphones, security cameras... if you go too far down that rabbit hole you’ll end up in a cabin in the woods. You certainly can’t hang out with other people, because they’re all carrying internet-connected microphones too.

You have to make a judgement call about what’s realistically plausible and react accordingly. If you don’t like the idea that an Amazon worker somewhere might hear some garbled rubbish after Alexa misheard the activation word, then you probably don’t want an Alexa device at home. Personally nothing all that exotic happens in my kitchen and there are a number of really useful features in the Alexa service, so we’ve got one and I’ve not lost any sleep over it.  
 

krikoman 15:12 Thu
In reply to tripehound:

Google "Israeli NSO" if you think iPhones are safe.

mullermn 15:19 Thu
In reply to krikoman:

> Google Israeli NSO, if you think Iphones are safe

If you’ve attracted the attention of a nation state’s security services I don’t think your choice of phone manufacturer is your major concern. That’s a somewhat different issue to whether Facebook is listening to you for advertising purposes. 

Edit: it’s also worth saying that the main thing that keeps you safe from that sort of exploit is how uninteresting you are ;) These vulnerabilities do exist, and companies like that make money exploiting them, but they are few, far between and precious. They do not risk exposing them and getting them closed down just to find out what cereal you like.

Post edited at 15:29
mullermn 15:34 Thu
In reply to krikoman:

If you want to see what can really be done when you have that level of resources and you’re prepared to blow them by exposing them, read up on the Stuxnet worm.

https://www.google.co.uk/amp/s/www.csoonline.com/article/3218104/what-is-stuxnet-who-created-it-and-how-does-it-work.amp.html

Long story short: created by the Israeli/US secret services to derail Iran’s uranium enrichment programme. Delivered using infected USB sticks left where people with physical access to the facility would find them. Once plugged into a PC on the right network, it then used a series of 4 zero-day exploits (zero-day = completely unknown to the security community) to propagate around the network until it found the centrifuge controllers, which it then modified to make them run slightly too fast while reporting that everything was OK, thereby damaging them. That’s proper spy movie stuff.

Bjartur i Sumarhus
In reply to mullermn:

"That’s proper spy movie stuff."

It's good, but you clearly have not been keeping up with the biggest spy story doing the rounds right now: WAGatha Christie, aka Coleen Rooney exposing Rebekah Vardy as an Instagram mole through techniques Ian Fleming would have been proud of.

krikoman 17:22 Thu
In reply to mullermn:

> They do not risk exposing them and getting them closed down just to find out what cereal you like. 

Ha ha, obviously you didn't look them up or see what they can do. There was a Radio 4 programme about them that I suggest you listen to. Their software is for hire, supposedly to combat terrorism, yet more likely available to anyone who can pay the price, at least according to the programme.

Timmd 17:22 Thu
In reply to Bjartur i Sumarhus: I saw on TV a teacher talking about some 9-year-old girls talking about wanting to be WAGs; celebrity culture is a pain in the bum.

Post edited at 17:22
krikoman 17:24 Thu
In reply to mullermn:

I've read about, and seen a few programmes on, Stuxnet, and had to do some work on live systems to try and protect them.

Stuxnet needed someone to introduce the worm onto the Iranian network; the NSO stuff doesn't need that.

mullermn 17:37 Thu
In reply to krikoman:

> Ha ha, obviously you didn't look them up or see what they can do. There was a Radio 4 programme about them that I suggest you listen to. Their software is for hire, supposedly to combat terrorism, yet more likely available to anyone who can pay the price, at least according to the programme.

I didn’t look them up beyond the first line of the Google results, no, but I’m familiar with the type of organisation. The fact remains that if they’re sitting on unknown vulnerabilities in order to give themselves an advantage, then they can’t be using the tools too widely, or someone will work out what the vulnerability is and fix it. There’s no way around that; technology isn’t magic.

Also some of the companies in this space have occasionally been a little disingenuous about exactly what they can and can’t do. Super secret infiltration techniques and unbiased peer review do not make good bedfellows.

mullermn 17:39 Thu
In reply to krikoman:

> Stuxnet needed someone to introduce the worm onto the Iranian network; the NSO stuff doesn't need that.

Well, if you leave your phone in airplane mode permanently then it will be air-gapped the same as the Iranian centrifuges, and it too will require someone to introduce the worm manually. I do accept this may reduce the utility of your phone.

