
I have been looking into doing some cluster computing on several Raspberry Pis. Is it possible to use any kind of GPU acceleration on a Pi?

TheDoctor

2 Answers


Apparently Broadcom will never create an OpenCL implementation for the GPU, so the short answer is no, unless you want to try some tricks with shaders/OpenGL as suggested in that link.
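To give a sense of what the shader trick involves: you pack your data into a 2D texture, run a fragment shader that applies the same operation to every texel, and read the result back. Here is a minimal conceptual sketch in Python, where a plain function stands in for the GLSL fragment shader you would actually write on the Pi (all names here are illustrative, not a real GL API):

```python
# Conceptual sketch of GPGPU via a fragment shader pass:
# pack data into a "texture", apply a per-texel kernel, read it back.

def pack_to_texture(data, width):
    """Pack a flat list into rows of `width` texels, padding with 0.0."""
    padded = data + [0.0] * (-len(data) % width)
    return [padded[i:i + width] for i in range(0, len(padded), width)]

def run_fragment_shader(texture, kernel):
    """Apply `kernel` to every texel, like a fragment shader pass."""
    return [[kernel(texel) for texel in row] for row in texture]

def read_back(texture, count):
    """Flatten the output texture and drop the padding."""
    flat = [t for row in texture for t in row]
    return flat[:count]

data = [1.0, 2.0, 3.0, 4.0, 5.0]
tex = pack_to_texture(data, width=4)
out = run_fragment_shader(tex, kernel=lambda x: x * x)  # square each element
result = read_back(out, len(data))
# result == [1.0, 4.0, 9.0, 16.0, 25.0]
```

On real hardware the kernel would be a GLSL fragment shader rendering into an off-screen framebuffer, and the read-back step would be `glReadPixels`; the sketch just shows the data-flow pattern.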

It looks like someone has implemented CUDA emulation for the Pi, but of course that won't really use the GPU on the Pi; instead, it pushes the processing to a remote x86 machine with a GPU.

Sergey Vlasov
goldilocks

Pete Warden, one of the developers behind TensorFlow, has written a great deal about the Pi's GPU support. Google is, as they say, your best friend. My earlier answer to Raspberry Pi 4 GPU math may also be useful.

  • While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - From Review – Chenmunka Jun 28 '23 at 12:49