AI software being tested by a US defence agency could one day lead to systems that watch employees, experts have warned. The pilot, run by the Defense Security Service (DSS), monitors the online activity of all employees with top-secret clearance, including their emails, social media use, and the websites they visit.

It is designed to detect ‘micro changes’ in workers’ behaviour, looking for evidence of disloyal employees and the future risks they may pose. The system would supplement existing screening forms by examining data drawn from employees’ online activity alongside the information they have provided.

If the pilot proves successful, civil liberties groups suggest, it could become a model for the corporate use of AI, raising questions about how closely companies should be able to track the digital lives of their employees in the future.

The technology was brought to light by an in-depth report on military developments by Patrick Tucker for Defense One, a US specialist publication. Griff Ferris, legal and policy officer at Big Brother Watch, a British civil liberties and privacy campaign group, says it sets a ‘worrying precedent’.

‘Using artificial intelligence to closely monitor everything employees do in the workplace, as well as in their personal lives, in an attempt to predict what they will do in the future, sets a worrying precedent,’ he told MailOnline. ‘It erodes the principle that we are innocent until proven guilty; people should certainly not be placed under surveillance at work.’

Military technology is invariably marketed to and adopted by the private sector. GPS, for example, was developed for the military in the 1970s and is now in everyday civilian use as a global tracking system. According to the DSS, the new pilot is driven by an urgent need to clear a security-clearance backlog of more than 600,000 people stuck in the current process.

The average prospective Defense Department employee waits a year because of delays in a system that involves mailing questionnaires to former employers, waiting for responses, and scanning the returned paper documents into a mainframe database.

Officials also say that this process is not only outdated but offers insight solely into a person’s employment history, when what is needed is an indication of future conduct. The new approach involves collecting a person’s digital footprint, or web activity, and matching it with other information the agency holds about that individual.

Because what we do online provides an insight into our behaviour, they hope it will give a fuller snapshot of the person. The pilot gathers a much broader spectrum of digital information, combines it with other data held within the Defense Department, and uses machine-learning algorithms to derive insights from it.

The researchers leading the programme say this is the ‘tradeoff’ for the privilege and power that come with holding a secret or top-secret clearance: when you seek a job handling highly sensitive national secrets, you agree to give up a lot of information about yourself.

‘Once fully built, it will look at the bulk of the cyber data you generate,’ said Mark Nehmer, technical director of research and development and technology transfer at DSS’s National Background Investigative Services. ‘It’s a date-time stamp based on IP. There is no name associated with it; you have to go to a different set of logs in order to marry those two things up.’
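
In practice, ‘marrying up’ those two sets of logs amounts to joining anonymous, IP-keyed activity records with a separate identity log. The DSS has not published how it does this; the Python sketch below is purely illustrative, and the log formats, field names and the DHCP-lease example are assumptions.

    # Illustrative sketch only: the DSS has not published its implementation.
    # It shows the kind of log correlation Nehmer describes -- activity records
    # keyed only by IP address and timestamp being 'married up' with a separate,
    # hypothetical DHCP lease log that maps IPs to named users over time.
    from datetime import datetime
    from typing import Optional

    # Hypothetical activity log: (timestamp, ip, action) with no name attached.
    activity_log = [
        (datetime(2019, 3, 1, 9, 15), "10.0.0.42", "visited example.com"),
        (datetime(2019, 3, 1, 23, 50), "10.0.0.42", "bulk file download"),
    ]

    # Hypothetical DHCP/identity log: which user held which IP, and when.
    dhcp_leases = [
        {"ip": "10.0.0.42", "user": "jdoe",
         "start": datetime(2019, 3, 1, 8, 0), "end": datetime(2019, 3, 1, 18, 0)},
        {"ip": "10.0.0.42", "user": "asmith",
         "start": datetime(2019, 3, 1, 18, 0), "end": datetime(2019, 3, 2, 8, 0)},
    ]

    def resolve_user(ip: str, when: datetime) -> Optional[str]:
        """Return the user who held the given IP at the given time, if known."""
        for lease in dhcp_leases:
            if lease["ip"] == ip and lease["start"] <= when < lease["end"]:
                return lease["user"]
        return None

    for when, ip, action in activity_log:
        print(when, ip, action, "->", resolve_user(ip, when))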

The data from web activity will be joined with data from what’s called ‘continuous evaluation’, a system that monitors life events relevant to clearance holders, such as getting married or divorced, taking on significant debt, tax returns, arrests, and sudden foreign travel.

Mr Nehmer said that the eventual goal is a system that can detect looming insider wrongdoing, as well as the far subtler state of ‘pre-wrongdoing’. ‘We can start to see whether the activity the individual is generating is increasing, decreasing, or remaining within a fairly normative range,’ he said.
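
The DSS has not described how such a ‘normative range’ would be computed. One common way to do this kind of check, shown in the hedged sketch below, is to compare each new measurement of an activity metric against the individual’s own recent baseline; the metric chosen here (after-hours logins), the window size and the threshold are all assumptions for illustration, not the agency’s method.

    # Illustrative sketch only, not the DSS system: flag whether an individual's
    # activity is 'increasing, decreasing, or remaining within a fairly normative
    # range' by comparing today's value against their own recent baseline.
    from statistics import mean, stdev

    def classify_activity(history: list, today: float, z_threshold: float = 2.0) -> str:
        """Compare today's activity metric against the person's recent baseline."""
        if len(history) < 5:
            return "insufficient baseline"
        baseline_mean = mean(history)
        baseline_sd = stdev(history)
        if baseline_sd == 0:
            return "within normative range" if today == baseline_mean else "deviation"
        z = (today - baseline_mean) / baseline_sd
        if z > z_threshold:
            return "increasing beyond normative range"
        if z < -z_threshold:
            return "decreasing beyond normative range"
        return "within normative range"

    # Hypothetical daily counts of after-hours logins over the past two weeks.
    recent_days = [2, 1, 3, 2, 2, 1, 2, 3, 2, 1, 2, 2, 3, 2]
    print(classify_activity(recent_days, today=9))   # flagged as increasing
    print(classify_activity(recent_days, today=2))   # within normative range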

‘Fundamentally, we’re there to look for micro behavioural changes that might indicate a person’s interest, or disinterest, in continuing their affiliation with the Department of Defense, or discontinuing their affiliation with life,’ he said. Mr Nehmer insists that the goal is not to slap cuffs on people, but to spot such changes before punishment becomes necessary.

The system is just a concept at the moment, but it raises questions about how much access our employers should have to our data. DSS experts say that, as with a lot of military technology, companies may want to implement something similar in the future, creating a new norm for employee monitoring.

When asked how likely it is that this type of AI surveillance will be adopted by corporate employers, Griff Ferris of Big Brother Watch said that not enough is known about the use of this kind of AI employee surveillance in the UK.

However, he did refer us to a Trades Union Congress report on workplace monitoring, which claimed that 56 per cent of workers think it’s likely they’re being monitored at work.

Around 70 per cent thought that surveillance is likely to become more common in the future and that the government should ‘ensure employers can only monitor their staff for legitimate reasons that protect the interests of workers’.

Recent reports from ex-Tesla employees claim that the company surveilled staff at its Gigafactory outside Reno, Nevada, eavesdropping on employees’ personal cellphones while they were at work. Tesla has denied the claims.
