Politeness doesn’t really amount to much when you’re programmed to get from point A to point B. But if robots are going to play an increased role in human society, questions arise about precisely how they’ll get along with the rest of us.

“Robots will live in our world soon enough and they really need to learn how to communicate with us on human terms,” MIT CSAIL research scientist Boris Katz said in a statement tied to a new research paper. “They need to understand when it is time for them to help and when it is time for them to see what they can do to prevent something from happening.”

The team calls the paper “the first very serious attempt for understanding what it means for humans and machines to interact socially.” The validity of that claim may be up for dispute, but the problem it’s attempting to solve, still at a very early stage, is no doubt one roboticists will increasingly have to consider as robots begin to play a larger role in our lives.

Researchers conducted tests in a simulated environment to develop what they deemed “realistic and predictable” interactions between robots. In the simulation, one robot watches another perform a task, attempts to determine its goal, and then tries either to help or to hinder it in that task.

“We have opened a new mathematical framework for how you model social interaction between two agents,” fellow project lead Ravi Tejwani said in a statement. “If you are a robot, and you want to go to location X, and I am another robot and I see that you are trying to go to location X, I can cooperate by helping you get to location X faster. That might mean moving X closer to you, finding another better X, or taking whatever action you had to take at X. Our formulation allows the plan to discover the ‘how’; we specify the ‘what’ in terms of what social interactions mean mathematically.”
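The observe-infer-act loop Tejwani describes can be sketched in miniature. The snippet below is an illustrative toy on a 1D line, not the paper’s actual mathematical formulation: the function names, the distance-based goal-inference rule, and the blocking heuristic are all assumptions made for the example.

```python
# Toy 1-D version of the setup described above: one agent observes
# another's movement, infers which candidate goal it is heading toward,
# then chooses an action to help or to hinder.
# All names and rules here are illustrative assumptions, not the
# authors' formulation.

def infer_goal(trace, candidate_goals):
    """Guess the observed agent's goal: pick the candidate whose
    distance to the agent shrank the most over the observed trace."""
    start, end = trace[0], trace[-1]
    return max(candidate_goals,
               key=lambda g: abs(g - start) - abs(g - end))

def choose_action(helper_pos, other_pos, goal, mode="help"):
    """Return a step (-1, 0, or +1) for the observing agent.
    Helping: move toward the inferred goal to assist there.
    Hindering: move toward the midpoint between the other agent
    and its goal, to stand in the way."""
    if mode == "help":
        return 1 if goal > helper_pos else -1
    block = (other_pos + goal) // 2  # crude blocking position
    if block == helper_pos:
        return 0                     # already in the way; stay put
    return 1 if block > helper_pos else -1

# Observed agent moves 0 -> 1 -> 2 -> 3; candidate goals at 5 and -4.
trace = [0, 1, 2, 3]
goal = infer_goal(trace, candidate_goals=[5, -4])
print(goal)                              # inferred goal: 5
print(choose_action(0, trace[-1], goal)) # helper at 0 steps toward 5
```

The real model works over richer state and plans multi-step trajectories; this sketch only shows the shape of the interaction, where the “what” (help or hinder) is specified and the “how” (which step to take) is derived from the inferred goal.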

The model is currently a relatively simple 2D simulation. The team is working to move toward a 3D version, while adding a neural network-based robot planner to increase the speed with which the robots learn from these actions.