AI and Social Credit Scores in the Workplace: the Latest Advent of Capitalism
Now more than ever, think twice about where you write a message, and three times before you send it.
“Operational security” is basically a fancy term for not being stupid in how you conduct your business. For us, it merely means averting termination. We need to work on this.
Here is how the world has evolved and how it impacts you.
Slack now has an official offering called AI Agents. And Slack has long allowed Enterprise Administrators to read public messages without you knowing.
lurel.io offers products that include “Sentiment Analysis.”
posipanda offers a service that assigns a toxicity and kindness score. Their pitch: “Identifying Toxic Behavior in Slack: Posipanda aims to foster a positive work culture by identifying and addressing communication challenges.”
nightfall.ai offers a service that uses AI to monitor for bullying.
Salesforce already makes use of Toxicity Scoring with LLMs, via its Einstein Generative AI, and it integrates with Slack.
Microsoft Teams makes “Sentiment Analysis” so easy that it shows up in the sample code for building a bot.
Zoom is also playing around with “Sentiment Analysis.”
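To make that concrete: wiring sentiment scoring into a chat bot takes very little code. The sketch below is purely illustrative, not any vendor’s actual implementation. It assumes the Bot Framework SDK for Node (botbuilder) plus Azure’s @azure/ai-text-analytics client; the endpoint, key, and recordScore sink are placeholders.

```typescript
// Illustrative sketch only: a chat-bot message handler that scores every
// message it can see. Endpoint, key, and the recordScore sink are placeholders.
import { ActivityHandler } from "botbuilder";
import { TextAnalyticsClient, AzureKeyCredential } from "@azure/ai-text-analytics";

const analytics = new TextAnalyticsClient(
  process.env.LANGUAGE_ENDPOINT ?? "", // placeholder config
  new AzureKeyCredential(process.env.LANGUAGE_KEY ?? "")
);

export class SentimentBot extends ActivityHandler {
  constructor() {
    super();
    // Fires for every message in any channel the bot has been added to.
    this.onMessage(async (context, next) => {
      const text = context.activity.text ?? "";
      const [result] = await analytics.analyzeSentiment([text]);

      if (result.error === undefined) {
        // result.sentiment is "positive" | "neutral" | "negative" | "mixed",
        // with per-class confidence scores attached.
        recordScore({
          user: context.activity.from?.name ?? "unknown",
          sentiment: result.sentiment,
          negativity: result.confidenceScores.negative,
        });
      }
      await next();
    });
  }
}

// Placeholder sink: in a real deployment this could be a database, a
// dashboard, or an automated flag routed to a manager.
function recordScore(entry: { user: string; sentiment: string; negativity: number }) {
  console.log(JSON.stringify(entry));
}
```

That’s roughly thirty lines to attach a per-user negativity score to everything the bot can read; feeding it into a dashboard is left as an exercise for your employer.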
Remember: at work, especially in a Right to Work state (read: right to get fired), you have almost no rights and no expectation of privacy, according to Workplace Fairness:
“Employers can legally monitor almost anything an employee does at work as long as the reason for monitoring is important enough to the business. Employers may install video cameras, read postal mail and e-mail, monitor phone and computer usage, use GPS tracking, and more.”
Operational security means not having conversations in unsafe venues about:
Your political opinions
Your pay
Your opinions of leadership and management
Those conversations shouldn’t happen on the company-provided communication medium. By all means, reach out and talk to your coworkers. Organize side channels. But don’t make it so easy to get marked for spreading negativity on platforms they administer and maintain. Moreover, think twice before endangering a coworker this way. Most people can talk about Trump and Biden and stay level-headed, but what about the one guy who gets emotional? Is it worth being party to a conversation that can end up hurting those who get too emotional, even when their hearts are in the right place?
Let’s say a hothead reaches out wanting to know what you thought of the HR meeting, or to vent about their own struggles at work. Sympathizing with them publicly, or reinforcing their perspective, gives them license to keep venting on that medium. Why put them in a position to dig themselves into a hole by welcoming that conversation there? It can only ever hurt their career.
When it comes to how you behave at work, assume your employer will act to protect its workplace culture. That’s a valid and fair assumption. Yes, you’re losing out, but for your own sake, consider how AI is giving them an edge. A dystopian corporate edge.