Like it was a "safety hazard" to have too many people at once in the smoking area behind the textile mill. It's a slippery slope to try to tease apart what is, and what isn't, interference, right?
Google also has a clause in their employment agreement that says you agree to them putting surveillance software on your devices, company supplied and personal, as a condition of your employment. I asked about that one, got the HR response "Well I suppose you could interpret it that way, but that isn't what we mean." and I said, "Okay, let's change it to say what you mean." and got the "Well we really aren't in a position to change these documents, it would be a mess trying to track a zillion individual agreements." etc etc. That rabbit hole of pushing back and forth leads to "perhaps Google isn't the right place for you." :-)
> Google also has a clause in their employment agreement that says you agree to them putting surveillance software on your devices, company supplied and personal, as a condition of your employment.
That doesn't mean it's enforceable or that it wouldn't run into statutory limits. Many workplace relations laws are written on a strict liability basis: intent isn't necessary for infringement. So if it has a chilling effect on organising, it's potentially infringing.
Mind you, there's something very rich in Google employees complaining about Chrome being used as a monitoring tool. Look around you, folks. What do you think pays for the fancy cafeteria? It's not hugs and smiles.
There were a number of things in the Google employment agreement that my attorney suggested were either unenforceable for legal reasons or overly broad; he also pointed out that companies generally don't retain people who sue them (the contract is employment 'at will'). His advice was that if you didn't like it, just quit. That is the low-cost way of getting out from under the contract, and it leaves little ill will behind. Suing is messy and follows one around to future job interviews.
I'm generally of the view that knowingly adding unenforceable provisions to a contract should be a crime and/or professional malpractice worthy of being disbarred. It's obtaining advantage through unconscionable deception. Or, less fancily: fraud.
Entirely too much of this kind of thing is allowed to slide by legislatures.
"The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man."
An acquaintance loves that quote, but modifies it slightly, " ... Therefore all progress depends on the unreasonable man to get someone else to change it."
> Google also has a clause in their employment agreement that says you agree to them putting surveillance software on your devices, company supplied and personal, as a condition of your employment.
How would Google even know about your personal devices? That seems to only make sense if you intend to use your personal device for work.
The Google policy probably applies to personal devices used in BYOD work settings.
Lots of people at tech companies use personal devices for work. This includes things like having your work calendar, email, or corporate applications on your personal phone. In my company's case there is some kind of enterprise iOS policy that gets installed as part of onboarding a device to access company systems. This has the ability to enforce a password policy, monitor installed software and software versions, lock out devices that are running known-vulnerable apps or iOS versions, and remotely wipe the entire device (I've never heard of that last one being exercised). Company data resides on the device (via email, documents, and other means), so it's reasonable for them to want some oversight. An attacker could attempt to get into company systems via the device, after all, since it has (some limited) access to my corporate account.
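For the curious, the kind of policy I'm describing is delivered as an Apple configuration profile. A minimal sketch of a passcode-policy payload might look like the fragment below; the key names come from Apple's documented "Passcode" payload, but the values are made up for illustration, and a real MDM enrollment bundles many more payloads than this.

```xml
<!-- Hypothetical fragment of a configuration profile an MDM might
     push to an enrolled iPhone to enforce a passcode policy.
     Key names follow Apple's Passcode payload; values are illustrative. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.mobiledevice.passwordpolicy</string>
    <key>forcePIN</key>
    <true/>
    <key>minLength</key>
    <integer>8</integer>
    <key>requireAlphanumeric</key>
    <true/>
    <key>maxFailedAttempts</key>
    <integer>10</integer> <!-- device wipes itself after this many failed unlocks -->
</dict>
```

The remote-wipe and app-inventory capabilities are separate MDM commands sent over Apple's management channel, not part of this payload.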
I use my personal phone for work because it's extremely handy to have work email and calendar on my phone, and I don't want to carry two phones.
It's not even an unusual condition. At Dropbox, starting some time in either 2018 or 2019 we had to install a remote administration app on our devices if we wanted to connect to corp vpn or to log into corporate google accounts. Mostly so that the device could be wiped if it was lost, IIRC.
When I did some contract work for a big crypto exchange, they required you to give them the ability to remote-wipe any device you used for their systems, including your personal phone. Since they required you to use Authy, I just set up Android on an old phone with a throwaway Gmail account, alongside the beat-to-hell Chromebook they provided that would not hold a charge for more than a few minutes.
I don't think the reply is about BYOD... There is speculation about IMSI catchers being used to target personal cell phones without any BYOD app installed.
Didn't a whistleblower at a car company suggest this...
And if such a company had that technology, then we can only speculate that its use is more widespread than imagined.
Maybe this is the only reason remote jobs are not available at these companies... nothing to do with being onsite, just a precaution to vet employees before even allowing them to be remote.
So if a company could capture your personal text messages and personal cell phone calls while you're on an office campus (with no BYOD app installed)... would they use that knowledge to prevent poaching as well?
This is quite normal BYOD security policy. If you don't intend to use your personal devices for anything work-related (connecting to the same network, using related accounts) this policy shouldn't apply at all.
Not BYOD devices, any device you use to connect to the corporate servers. So if you read your corp email on your home computer, or access employee only services from your phone or laptop etc. One of the interesting things about working there was that much of what you needed to do could be done over the existing Google services.
As for how? It was left unspecified, but we speculated they could drop a keylogger or other bit of surveillance kit if you logged into your corporate gmail account from a device where it wasn't already installed. Clearly crooks can do this; it has to be easier when you can sign your downloads and are a 'trusted' vendor.
> So if you read your corp email on your home computer, or access employee only services from your phone or laptop etc
I know literally no Google employees that use Google corp resources from personal computers. Phones are common, but as you're hopefully aware, the security considerations are different.
> As for how? It was left unspecified but we speculated they could drop a keylogger or other bit of surveillance kit if you logged into your corporate gmail account from a device where it wasn't already installed.
Who theorized this? Because it's certainly not a thing that's ever happened. It would be trivial to detect, either as a client, or by looking at the source of Gmail.
> Like it was a "safety hazard" to have too many people at once in the smoking area behind the textile mill. It is a slippery slope to try to tease apart what is, and what isn't, interference right?
No. It is a necessary thing to do and what a legal system ostensibly exists to do.
A slippery slope in what direction? In the direction of too many rules being illegal or too many things that could impede organizing being allowed?