Reply – Re: JInput integration in NEWT?
In Reply To
Re: JInput integration in NEWT?
— by Sven Gothel
On Thursday, September 08, 2011 04:47:34 PM gouessej [via jogamp] wrote:

> Hi
> JInput is an API for manipulating controllers (gamepads, joysticks,
> etc.), mice, and keyboards. I would like to use it, but under Windows it
> relies on DirectInput, which Microsoft has largely stopped maintaining in
> favor of XInput.
> You can find its source code here:
> Would someone need such an API in NEWT? I would like to support controllers
> in my game.

First of all, yes - extending the input facilities is essential.

Along these general lines, we added 'multitouch' a while ago (proposals).

See the current 'layout' of MouseEvent:
;a=blob;f=src/newt/classes/com/jogamp/newt/event/;h=62a8941d7753482932a03d7f4176e3b0f15ab004;hb=444eaa259116f7985711164512607ad46015fa4b

This should demonstrate the generic way in which we can add things, IMHO.

So there are at least 2 ways of doing things, IMHO:
  1 - integrate input devices into the NEWT input event handling;
      the implementation may be 'native' or use a 3rd party API layer (jinput, ..)

  2 - use a 3rd party API layer directly

In regards to (1), I could see the joystick/gamepad input device being mapped
as a mouse pointer with many buttons, since we now support multiple pointers.
Its implementation may be native, or we could use a (well supported) 3rd party
Java implementation. I guess the problem here is the wording 'well supported' :)
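To make the pointer-mapping idea concrete, here is a minimal sketch of how a gamepad's analog stick could be turned into pointer coordinates. It assumes the axis is reported as a normalized value in [-1, 1], as DirectInput/JInput-style APIs typically do; the class and method names are illustrative, not actual NEWT API.

```java
// Sketch: map a gamepad's analog stick to window pointer coordinates,
// assuming normalized axis values in [-1, 1]. Hypothetical names, not NEWT API.
public class GamepadAsPointer {

    // Convert a normalized axis value [-1, 1] to a window coordinate [0, extent).
    static int axisToCoord(float axis, int extent) {
        float clamped = Math.max(-1f, Math.min(1f, axis)); // guard against noisy drivers
        return (int) ((clamped + 1f) * 0.5f * (extent - 1));
    }

    public static void main(String[] args) {
        // A centered stick maps to (roughly) the center of an 800x600 window.
        System.out.println(axisToCoord(0f, 800));   // 399
        System.out.println(axisToCoord(-1f, 600));  // 0 (top/left edge)
        System.out.println(axisToCoord(1f, 600));   // 599 (bottom/right edge)
    }
}
```

The resulting coordinates, plus the gamepad's button states, could then populate a multi-pointer MouseEvent as described above.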

How hard can it be to receive the joystick/gamepad data?
If someone could provide some native code snippets, I would be more than glad
to funnel them into the NEWT event dispatching mechanism.
IMHO these events would simply be passed
to the NEWT window owning the input focus, done.
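The "pass to the focused window, done" idea can be sketched as a small dispatcher: externally polled events land in one queue and are drained to whichever window currently owns the input focus. All names here (InputEvent, Window, FocusedDispatch) are illustrative stand-ins, not the actual NEWT classes.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Sketch: funnel externally polled input events into a single queue and
// dispatch them only to the window owning the input focus. Illustrative
// stand-in names, not actual NEWT API.
public class FocusedDispatch {

    static class InputEvent {
        final String description;
        InputEvent(String description) { this.description = description; }
    }

    // Minimal stand-in for a NEWT window that can receive events.
    static class Window {
        final String name;
        int received = 0;
        Window(String name) { this.name = name; }
        void consume(InputEvent e) { received++; }
    }

    private final Queue<InputEvent> queue = new ArrayDeque<InputEvent>();
    private Window focused; // the single window owning the input focus

    void setFocus(Window w) { focused = w; }
    void enqueue(InputEvent e) { queue.add(e); }

    // Drain the queue: every pending event goes to the focused window, done.
    void dispatch() {
        for (InputEvent e = queue.poll(); e != null; e = queue.poll()) {
            if (focused != null) focused.consume(e);
        }
    }

    public static void main(String[] args) {
        FocusedDispatch d = new FocusedDispatch();
        Window a = new Window("a"), b = new Window("b");
        d.setFocus(a);
        d.enqueue(new InputEvent("joystick axis moved"));
        d.enqueue(new InputEvent("gamepad button pressed"));
        d.dispatch();
        System.out.println(a.received + " " + b.received); // 2 0
    }
}
```

A native poller (or a 3rd party layer like JInput) would sit on the enqueue side; NEWT's existing listener machinery would sit behind consume().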


In regards to (2), and also for a broader reflection on input mechanisms,
we may need to review the APIs you have mentioned.

> What are the viable alternatives to DirectInput/XInput on Windows?
> StreamInput?
> In my humble opinion, StreamInput is the only possibility, the traditional
> Windows event loop might not be fast enough for this and... I prefer relying
> on open source APIs ;)

Well, IMHO StreamInput is a novel academic discussion of semantic input facilities,
i.e. face recognition, geometric processing, etc.

OFC, it will become a target one day, and maybe the time is right
to jump on the train, participate, and generate a binding.
Such a binding may require OpenCL and OpenCV. The latter would be a good start,
being implemented partially with OpenCL. AFAIK somebody did exactly this somewhere
(forgot the URL).

And then again, somehow those input facilities may need to end up
in events being sent via the NEWT event dispatcher.
We could easily use generic events for such a purpose, where the receiver would
cast it to the expected tagged type, e.g. a StreamInput.FaceShowsUp event :)
This will be a sweet thing (coming some time soon) for sure.
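The generic-event idea above can be sketched in a few lines: the dispatcher only knows a base event type carrying a tag and an opaque payload, and the receiver casts the payload to the type the tag promises. The FaceShowsUp tag follows the example in the text; everything else here is hypothetical.

```java
// Sketch: a generic, tagged event. The dispatcher stays payload-agnostic;
// the receiver casts to the expected tagged type. Hypothetical names.
public class GenericEventDemo {

    static class GenericEvent {
        final String tag;     // identifies the payload type
        final Object payload; // receiver casts this to the expected type
        GenericEvent(String tag, Object payload) { this.tag = tag; this.payload = payload; }
    }

    // Example payload for the 'StreamInput.FaceShowsUp' tag from the text.
    static class FacePayload {
        final int faceId;
        FacePayload(int faceId) { this.faceId = faceId; }
    }

    static String handle(GenericEvent e) {
        if ("StreamInput.FaceShowsUp".equals(e.tag)) {
            FacePayload p = (FacePayload) e.payload; // cast to the tagged type
            return "face #" + p.faceId + " detected";
        }
        return "unhandled: " + e.tag;
    }

    public static void main(String[] args) {
        GenericEvent e = new GenericEvent("StreamInput.FaceShowsUp", new FacePayload(7));
        System.out.println(handle(e)); // face #7 detected
    }
}
```

This keeps the NEWT dispatcher unchanged when new semantic input sources appear; only the receivers need to know the new tags.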

> Would it be difficult to adapt JInput to be built with GlueGen?
> Is the use of JInput as a starting point a good idea, given that it does not
> use StreamInput?

For sure all good ideas.

IMHO the fastest way to get joystick and gamepad support going
is the above-mentioned option (1).
So anybody, please .. I would need your expertise for such native input APIs
and maybe a few native code examples. THANK YOU.

> Best regards.