OpenLab - Open digital platform


OpenLab is an architecturally modern space equipped with state-of-the-art technology and open to all students. Its unique location, directly in the open lobby of the department, lets anyone interested try out the solutions created there and, at the same time, gather feedback on their own solutions. OpenLab is an example of a lab built around its customers: in our case, students, staff and visitors.

OpenLab is also a platform on which we implement teaching through ongoing challenges and competitions. In this space we also test and develop our virtual assistant Ola. Join one of our challenges and help us teach Ola more!

OpenLab enables the verification of intelligent-space concepts and was built for the development and research of modern digital technologies, including the Internet of Things (IoT), usability and UX evaluation, digital art and artificial intelligence. Interconnected multifunction sensors, smart meters, projection surfaces, large LCD displays, microphones, speakers and recording cameras open up virtually unlimited possibilities for creating interactive multimedia applications. OpenLab was built with the support of T-Systems and the KEGA project no. 053TUKE-4/2019 "Výučba softvérového inžinierstva prostredníctvom sústavných výziev a súťaží" (Teaching software engineering through ongoing challenges and competitions).

Come help us build OpenLab and boost your creativity!

If you're interested, please contact Dr. Dominik Lakatoš.

What you can help with:

  • intelligent lighting
  • dynamic information screens
  • interactive presentations - image, sound, light
  • image and sound processing
  • games for the visitors - controlled by a mobile device, movement, gestures, sight, voice
  • collection, storage and analysis of sensor data
  • safety and security
  • device integration - controlling sliding doors, window tinting, employee card entry
  • administrative console
  • smart sensors and devices
  • Ola virtual assistant
  • interactive KPI logo
  • and surely you have ideas of your own


What is currently there?

Input devices

  • 6 multifunction sensors (temperature, humidity, pressure, sound, light, vibration)
  • energy metering for the whole space (current, voltage, power)
  • 20 cameras (Full HD and higher) with recording and automatic IR night illumination
  • 2 spatial microphones (with an example of voice control in Slovak)
  • 6 Raspberry Pi units in the ceiling with WiFi and Bluetooth; for example, we can currently detect known devices over Bluetooth
  • 2 Kinect One sensors placed under the 3x3 and 2x2 panels
  • we can query the state of any device, such as a TV or projector
  • we can automatically control the tinting of the window walls, with information about the tint position on each window
  • we also plan automatic control of the doors at both OpenLab entry points and of employee card entry, along with information about the door states
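The known-device detection mentioned above can be sketched in a few lines: compare the MAC addresses seen in a Bluetooth scan against a registry of known devices. The registry contents, device names and scan results below are hypothetical examples, not the lab's actual data.

```python
# Hypothetical registry mapping Bluetooth MAC addresses to device names.
KNOWN_DEVICES = {
    "AA:BB:CC:DD:EE:01": "phone-alice",
    "AA:BB:CC:DD:EE:02": "laptop-bob",
}

def detect_known(scanned_macs):
    """Return the names of registered devices present in a scan."""
    return sorted(KNOWN_DEVICES[mac] for mac in scanned_macs if mac in KNOWN_DEVICES)

# On a Raspberry Pi the scan itself would come from a Bluetooth tool or
# library; here we simply feed in a static list of scanned addresses.
seen = ["AA:BB:CC:DD:EE:02", "11:22:33:44:55:66"]
print(detect_known(seen))
```

In practice the scan would run periodically and the result would be published for other parts of the system to consume.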

Output devices

  • large display panel consisting of 9 TVs (3x3) with an overall resolution of 6K
  • small display panel consisting of 4 TVs (2x2) with an overall resolution of 4K
  • 5 vertical FullHD panels
  • 4 projectors - each 1920x1200
  • 1 touch FullHD display
  • 97 lights, each controlled individually for all 4 color components (RGBW – red, green, blue, white)
  • 8 speakers, each individually controllable (the software needs more work; currently we have stereo sound, i.e. 4+4)
  • an interactive department logo with 8 independent RGB lighting LED segments
  • 4 LED floodlights above the stairs (control still needs work)

Current functional usage scenarios

  • we can send a video or a web page (URL) to any display device (this does not yet work automatically on all of them; sometimes the browser has to be opened manually, but we are working on automating it)
  • automatic lighting: the light intensity is automatically adjusted to the room environment
  • voice processing and voice control
  • automatic detection of people via registered BT devices
  • automatic speech synthesis in Slovak from given text
  • scheduled automatic tasks (turning the displays on in the morning, turning the lights off if they were left on, turning the lights on in the morning and adjusting their intensity automatically)
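The automatic lighting scenario boils down to mapping ambient brightness to a light output level. A minimal sketch, assuming a linear ramp between two hypothetical lux thresholds (the actual thresholds and control curve used in OpenLab may differ):

```python
def light_intensity(ambient_lux, min_lux=50, max_lux=500):
    """Map ambient brightness (lux) to a light output level in 0..255.

    Full output when the room is dark, lights off when ambient light is
    ample, and a linear ramp in between. The thresholds are illustrative.
    """
    if ambient_lux <= min_lux:
        return 255
    if ambient_lux >= max_lux:
        return 0
    frac = (ambient_lux - min_lux) / (max_lux - min_lux)
    return round(255 * (1 - frac))
```

A controller loop would read the light sensor periodically, call this function, and push the result to the lights.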

Detailed description of HW components

  • 13x Samsung Smart Signage PH49F
  • 5x NEC TV
  • 3x projector Epson EB2255
  • 1x projector Epson EB2247
  • 12x camera Hikvision DS-2CD2522FWD-IS
  • 2x camera Hikvision DS-2CD2552F-I
  • 3x camera Hikvision DS-2CD2942F
  • 2x microphone Audix M55W
  • 1x Synology disk array RS2416+ (72 TB) with RAID 6 to store camera recordings and data
  • 3x control PCs (Windows 10) driving the output devices
    • PC 1 drives the 3x3 panel via a GeForce GTX 1060 and handles the sound input/output
    • PC 2 drives the 5 vertical displays via a Matrox C680
    • PC 3 drives the 2x2 panel and 3 projectors via a GeForce GTX 1060
  • 1x control mini PC (Intel NUC) for the Kinect sensor next to the 2x2 panel
  • 10x Raspberry Pi, each with 3 custom boards based on the PCA9685 PWM chip to control the lights, plus a 500 W 24 V power supply for each
  • 6x Raspberry Pi with the sensors (described above) connected via USB, powered by a standard Raspberry Pi adapter
  • 2x Xbox One Kinect sensors, connected via USB to the control PC for the 3x3 panel and to the dedicated Intel NUC control PC next to the 2x2 panel
  • 1x network-controlled audio mixer (Soundcraft Ui-24R) for connecting the microphones and speakers
  • 1x 8-channel amplifier (Monacor STA-850D)
  • 8x Electro-Voice EVID 4.2 speakers
  • 1x 24-port 1 Gbit Zyxel GS1920-24HP switch with PoE support (for the cameras)
  • 2x 24-port Cisco 100 Mbit switches
  • 1x HP DL380 server (E5-2620 v4, 32 GB RAM, 3x 300 GB HDD) running virtualization
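The PCA9685 boards mentioned above drive the lights with 12-bit PWM, so an 8-bit RGBW color has to be scaled to the chip's 0-4095 duty range. A minimal sketch of that conversion (the channel ordering and scaling policy are assumptions for illustration):

```python
def rgbw_to_pwm(r, g, b, w):
    """Scale 8-bit RGBW channel values (0-255) to the PCA9685's 12-bit
    duty range (0-4095), so that 255 maps exactly to full duty (4095)."""
    def scale(v):
        if not 0 <= v <= 255:
            raise ValueError("channel value out of range")
        return (v * 4095) // 255

    return tuple(scale(v) for v in (r, g, b, w))
```

On the Raspberry Pi these four values would then be written to four consecutive PWM channels of the PCA9685 for each light.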

Individual sensors and computers communicate with one another via MQTT messages.
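One way such an MQTT message might be put together is sketched below; the topic layout and JSON payload fields are illustrative assumptions, not the lab's actual convention.

```python
import json
import time

def sensor_message(sensor_id, readings):
    """Build a (topic, payload) pair for publishing one sensor sample.

    The topic scheme "openlab/sensors/<id>" and the JSON fields are
    hypothetical examples of how readings could be structured.
    """
    topic = f"openlab/sensors/{sensor_id}"
    payload = json.dumps({"ts": int(time.time()), **readings})
    return topic, payload

# With an MQTT client library such as paho-mqtt, the pair would then be
# passed to a connected client's publish() call, roughly:
#   client.publish(*sensor_message("s1", {"temperature": 21.5}))
```

Keeping the message construction separate from the network client makes the format easy to test without a running broker.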

The space is ready for new devices to be added via spare power sockets and Ethernet cables.

API documentation is accessible to all TUKE students after signing in on this wiki page.