Please help me: Sending a string from PyCharm via USB to an NXT programmed with Bricx CC #128675
Replies: 3 comments 2 replies
-
I think this might help you. 1. Set Up Serial Communication in PyCharm: first, you'll need to establish a serial communication link between your computer and the NXT. For this, you can install and use the pyserial library.
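A rough sketch of that setup, assuming the pyserial package and a placeholder port name; note that the NXT normally only appears as a COM port over a Bluetooth serial link rather than plain USB, so 'COM3' is purely illustrative:

```python
# Hypothetical sketch: send the recognized gesture as a newline-terminated string
# over a serial port. Requires pyserial (pip install pyserial); the port name and
# baud rate are placeholders to adjust for your setup.
import serial

ser = serial.Serial('COM3', 9600, timeout=1)

def send_gesture(gesture):
    # A newline marks the end of the string for the receiving program.
    ser.write((gesture + "\n").encode("ascii"))

send_gesture("rock")
ser.close()
```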
-
For your project, you'll need to set up USB communication between your Python program (which uses Mediapipe for gesture recognition) and your NXT brick (which will be controlled using the Bricx Command Center). Since direct USB communication with the NXT from Bricx Command Center is challenging, we can use a workaround: control the NXT motors over USB directly from Python with the nxt-python library.

```python
import nxt.locator
from nxt.motor import Motor, PORT_A, PORT_B, PORT_C
import time

def send_command(gesture):
    # Connect to the NXT brick over USB
    brick = nxt.locator.find_one_brick()
    m_thumb = Motor(brick, PORT_A)
    m_index = Motor(brick, PORT_B)
    m_others = Motor(brick, PORT_C)

    if gesture == "rock":
        # Make the opposite gesture (paper)
        m_thumb.turn(100, 180)   # Adjust rotation angle as needed
        m_index.turn(100, 180)
        m_others.turn(100, 180)
    elif gesture == "paper":
        # Make the opposite gesture (scissors)
        m_thumb.turn(100, 90)    # Adjust rotation angle as needed
        m_index.turn(100, 90)
        m_others.turn(100, 90)
    elif gesture == "scissors":
        # Make the opposite gesture (rock)
        m_thumb.turn(100, 360)   # Adjust rotation angle as needed
        m_index.turn(100, 360)
        m_others.turn(100, 360)
    else:
        print("Unknown gesture")

# Example usage
gesture = "rock"  # Replace this with the gesture recognized by Mediapipe
send_command(gesture)
```

Integrate the Mediapipe hand gesture recognition code with the motor control code above:

```python
import mediapipe as mp
import cv2

# Initialize Mediapipe Hands
mp_hands = mp.solutions.hands
hands = mp_hands.Hands()
mp_drawing = mp.solutions.drawing_utils

def get_gesture():
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break
        # Convert the BGR image to RGB.
        image = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        image.flags.writeable = False
        results = hands.process(image)
        image.flags.writeable = True
        image = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(image, hand_landmarks, mp_hands.HAND_CONNECTIONS)
                # Analyze hand_landmarks to determine the gesture
                gesture = "rock"  # Placeholder: replace with actual gesture detection logic
                cap.release()
                cv2.destroyAllWindows()
                return gesture
        cv2.imshow('Hand Tracking', image)
        if cv2.waitKey(5) & 0xFF == 27:
            break
    cap.release()
    cv2.destroyAllWindows()
    return "none"

# Main loop
while True:
    gesture = get_gesture()
    if gesture != "none":
        send_command(gesture)
```
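The placeholder in get_gesture() still needs real detection logic. One possible sketch counts extended fingers from Mediapipe's 21 hand landmarks by comparing each fingertip's y coordinate with the joint below it; the landmark indices follow Mediapipe's standard numbering, but the heuristic assumes an upright hand facing the camera and may need tuning for your setup:

```python
# Hypothetical helper: classify rock/paper/scissors from Mediapipe hand landmarks.
# Assumes an upright hand facing the camera; thresholds may need tuning.
def classify_gesture(hand_landmarks):
    lm = hand_landmarks.landmark
    # Fingertip and PIP-joint indices for index, middle, ring, and pinky fingers.
    finger_tips = [8, 12, 16, 20]
    finger_pips = [6, 10, 14, 18]
    # A finger counts as extended if its tip is higher in the image (smaller y)
    # than its PIP joint.
    extended = [lm[tip].y < lm[pip].y for tip, pip in zip(finger_tips, finger_pips)]
    if sum(extended) == 0:
        return "rock"
    if all(extended):
        return "paper"
    if extended[0] and extended[1] and not extended[2] and not extended[3]:
        return "scissors"
    return "none"
```

Inside the for hand_landmarks loop you would then call classify_gesture(hand_landmarks) instead of hard-coding "rock".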
-
Hi! Your project sounds really interesting! For receiving a string from PyCharm on the NXT using Bricx Command Center, you'll need to use the Serial commands in NXC (Not eXactly C) to handle the communication over USB. Here's a basic idea of what you could do:

1. Set up the serial connection in PyCharm: use the pyserial library to send data via USB.
2. Use NXC's Serial commands to read the string and control the motors accordingly.

```python
import serial

# Set up the serial connection
ser = serial.Serial('COM3', 9600)  # Adjust COM port as needed
gesture = "rock"  # Example gesture
```
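If the COM-port route does not work over plain USB (the NXT usually only exposes a serial port over a Bluetooth link), another option on the Python side is a sketch like the one below, assuming nxt-python's message_write direct command: it drops the string into one of the NXT's mailboxes over USB, and the NXC program running on the brick can read it with ReceiveRemoteString on the matching mailbox. The mailbox number, encoding, and exact message_write argument types are assumptions to check against your nxt-python version:

```python
# Hypothetical sketch: send the recognized gesture into NXT mailbox 0 over USB.
# Assumes nxt-python (2.x style API shown); argument types vary between versions,
# and the message may need a trailing null terminator.
import nxt.locator

brick = nxt.locator.find_one_brick()  # find the NXT over USB

def send_gesture(gesture):
    # Write the gesture string into mailbox 0; the NXC side would read it with
    # something like ReceiveRemoteString(MAILBOX1, true, msg).
    brick.message_write(0, gesture + "\0")

send_gesture("rock")
```

On the NXC side you can then compare the received string against "rock", "paper", and "scissors" and drive the three motors accordingly.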
-
Body
Hello guys, I am new here and hope for some help.
So, I am doing a school project right now: we are working with Lego Mindstorms NXT and write the program with the Bricx Command Center. I know that this is pretty old, but maybe some of you have tips or ideas.
My idea was to create the game rock paper scissors, but in a way the player never wins. To do that, I worked with the Mediapipe hand gesture recognition running in PyCharm, which works great: my webcam analyses the hand gesture. But we also have to work with the Bricx Command Center, which works like this: I built a Lego hand, and the thumb, the index finger, and the rest of the fingers are connected to 3 motors. With those I can make the opposite gesture to the gesture the player made. So when the program analyses the player's gesture, I want it to send a string via USB to the NXT. The NXT knows the string and makes the corresponding gesture. But that's exactly my problem, because I also need a program written in the Bricx Command Center which can receive the string I send from PyCharm via USB. Are there any ideas or solutions for how I could do that? It would be very helpful. I tried many things, including asking ChatGPT, but as we all know that isn't often very helpful.