Interaction between humans and computers is an area of research that came into focus in the late 1980s. Human hand simulation received remarkable research attention in the past, but over the previous few years computational ability and camera performance have improved rapidly, so hand simulation is no longer limited to paper or digital surfaces; it has been extended to three dimensions [14].
In most systems, robotic arms are controlled by a controller. An Atmega32 microcontroller has been used as the processing unit to interface hand-gesture input with the robotic arm; the arm can move in 2D or 3D depending on the system design parameters [15] [16]. A GUI was designed and implemented to support the process of generating hand signals and movement; controlling the virtual hand, recording hand images, and creating hand gestures and animations were its major functions [17]. This work was extended by producing a virtual environment for the hand models and developing the necessary operations, so that the GUI serves as an interactive interface between the virtual hand and its users, with the interface built around a hand simulation.
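As a minimal sketch of the idea of interfacing gesture input with a microcontroller-driven arm, the snippet below maps recognised gesture labels to joint angles and packs them into a byte frame a controller could parse. The gesture names, joint layout, and 3-byte protocol are illustrative assumptions, not the Atmega32 interface of [15] [16].

```python
# Hypothetical mapping from recognised gestures to joint-angle commands.
# Names, angles, and the 3-byte frame format are illustrative assumptions.
GESTURE_TO_ANGLES = {
    "open_palm":   (90, 45, 0),    # (base, shoulder, gripper) in degrees
    "fist":        (90, 45, 80),   # close the gripper
    "point_left":  (30, 45, 0),    # rotate base left
    "point_right": (150, 45, 0),   # rotate base right
}

def encode_command(gesture: str) -> bytes:
    """Pack the joint angles for a gesture into a 3-byte command frame."""
    base, shoulder, gripper = GESTURE_TO_ANGLES[gesture]
    return bytes([base, shoulder, gripper])

# Example: a "fist" gesture produces a frame that closes the gripper.
frame = encode_command("fist")
```

In a real system the frame would be written to the microcontroller over a serial link, and the firmware would drive the servos accordingly.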
Real-time hand gesture recognition is used to simulate human hand movement [18] [19]. A camera captures images, which then pass through several processing phases, the first being acquisition. Keyboards and pointing devices are not used in this procedure, so the authors focused on smart interfaces, the basic idea being that capture devices are increasingly installed in homes and game consoles. They were careful about the design of the processing system [20].
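A simplified sketch of the step that follows acquisition is shown below: a captured RGB frame is thresholded into a binary hand mask. The colour bounds are assumed skin-tone ranges chosen for illustration, not values from [18]–[20], and a synthetic array stands in for a camera frame.

```python
# Illustrative segmentation step after image acquisition: threshold an RGB
# frame into a boolean hand mask. The bounds below are assumed skin-tone
# ranges, not parameters from the cited systems.
import numpy as np

def segment_hand(frame: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels inside a simple RGB skin-colour range."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

# Synthetic 2x2 "frame": two skin-like pixels, one dark pixel, one blue pixel.
frame = np.array([[[200, 120, 90], [10, 10, 10]],
                  [[180, 100, 80], [0, 0, 255]]], dtype=np.uint8)
mask = segment_hand(frame)
```

Later phases would then extract features (contours, fingertips) from this mask and classify the gesture in real time.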
In computer games, hand gestures are used to control the movement and orientation of game objects. Hand gestures and robotic arms can be used with computers and games; on the PlayStation, an EyeToy camera captures the hand-gesture motion.