This is a copy of the Java documentation for libGDX. It has not yet been fully ported to SharpGDX, so the examples below still use the Java API.
The main input devices SharpGDX supports are the mouse on the desktop/browser, touch screens on Android and iOS, and keyboards. Let’s review how SharpGDX abstracts each of them.
Keyboard
Keyboards signal user input by generating events for pressing and releasing a key. Each event carries with it a key-code that identifies the key that was pressed/released. These key-codes differ from platform to platform; SharpGDX hides these differences by providing its own key-code table, see the Keys (source) class. You can query which keys are currently being pressed via Polling.
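For example, polling the keyboard each frame might look like the following. This is a minimal sketch using the libGDX Java API (this page has not been ported to C# yet); isKeyPressed reports keys that are currently held, while isKeyJustPressed only fires on the frame a key went down.

// typically called once per frame, e.g. in render()
if (Gdx.input.isKeyPressed(Input.Keys.LEFT)) {
    // the left arrow key is currently held down
}
if (Gdx.input.isKeyJustPressed(Input.Keys.SPACE)) {
    // the space key went down since the last frame
}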
Key-codes alone do not give us information about which character the user actually entered. This information is often derived from the state of multiple keys, e.g. the character ‘A’ is generated by the keys ‘a’ and ‘shift’ being pressed simultaneously. In general, deriving characters from the keyboard’s state (which keys are down) is non-trivial. Thankfully, the operating system usually has a means to hook up an event listener that not only reports key-code events (key pressed/key released), but also characters. SharpGDX uses this mechanism under the hood to provide you with character information. See Event Handling.
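As a rough sketch of what this looks like in the libGDX Java API, an InputProcessor (here via the InputAdapter convenience class) receives both the raw key-code events and the derived characters:

// register a listener that receives both key-code and character events
Gdx.input.setInputProcessor(new InputAdapter() {
    @Override public boolean keyDown (int keycode) {
        // raw key-code event, e.g. Input.Keys.A when the physical 'a' key goes down
        return true;
    }

    @Override public boolean keyTyped (char character) {
        // the character derived from the keyboard state, e.g. 'A' when shift + a are down
        return true;
    }
});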
As most Android and iOS devices lack a physical keyboard, we can use Gdx.input.setOnscreenKeyboardVisible(true) to bring up the on-screen keyboard. It is also possible to specify which type of keyboard to display by passing in an OnscreenKeyboardType.
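A minimal sketch of both variants, again using the libGDX Java API (the OnscreenKeyboardType overload may not be available in older versions):

// show the default on-screen keyboard, e.g. when a text field gains focus
Gdx.input.setOnscreenKeyboardVisible(true);

// or request a specific layout, e.g. a numeric keypad
Gdx.input.setOnscreenKeyboardVisible(true, Input.OnscreenKeyboardType.NumberPad);

// hide it again once text entry is finished
Gdx.input.setOnscreenKeyboardVisible(false);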
Mouse & Touch
Mouse and touch input allow the user to point at things on the screen. Both input mechanisms report the location of interaction as 2D coordinates relative to the upper left corner of the screen, with the positive x-axis pointing to the right and the y-axis pointing downward.
Mouse input comes with additional information, namely which button was pressed. Most mice feature a left and a right mouse button as well as a middle mouse button. In addition, there’s often a scroll wheel which can be used for zooming or scrolling in many applications.
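The cursor position and button state can be polled at any time. This is a minimal sketch using the libGDX Java API; note that the scroll wheel has no polling method and is only reported through the scrolled() event, as in the example at the end of this page.

// current cursor position in screen coordinates (origin in the upper left, y pointing down)
int x = Gdx.input.getX();
int y = Gdx.input.getY();

// check which buttons are currently held down
boolean leftDown   = Gdx.input.isButtonPressed(Input.Buttons.LEFT);
boolean rightDown  = Gdx.input.isButtonPressed(Input.Buttons.RIGHT);
boolean middleDown = Gdx.input.isButtonPressed(Input.Buttons.MIDDLE);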
Touch input does not have the notion of buttons and is complicated by the fact that multiple fingers might be tracked depending on the hardware. First generation Android phones only supported single-touch. Starting with phones like the Motorola Droid, multi-touch became a standard feature on most Android phones.
Note that touch can be implemented quite differently on different devices. This can affect how pointer indexes are specified and released and when touch events are fired. Be sure to test your control scheme on as many devices as possible. There are also many input test apps available in the market which can help determine how a particular device reports touch and aid in designing a control scheme that works best across a range of devices.
SharpGDX provides unified handling of mouse and touch input. We view mouse input as a specialized form of touch input: only a single finger is tracked, and in addition to coordinates we also report which buttons were pressed. For touch input we support tracking multiple fingers (pointers) and report the left mouse button for all events.
Note that on Android the coordinate system is either relative to portrait or landscape mode, depending on what you set for your application.
Mouse and touch input can either be polled or processed via Event Handling.
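Polling is the quickest way to read the current pointer state. The sketch below uses the libGDX Java API; isTouched() and getX()/getY() without arguments refer to the first pointer (or the mouse), while additional fingers on multi-touch devices are addressed by pointer index. The loop bound of 5 here is only an illustration; the actual maximum number of pointers depends on the backend.

// somewhere in render(): poll the current pointer state
if (Gdx.input.justTouched()) {
    // a touch went down (or the left mouse button was just pressed) since the last frame
}
if (Gdx.input.isTouched()) {
    int x = Gdx.input.getX(); // screen coordinates, origin in the upper left
    int y = Gdx.input.getY();
    // react to the touch/click at (x, y)
}

// additional fingers can be polled by pointer index on multi-touch devices
for (int pointer = 1; pointer < 5; pointer++) {
    if (Gdx.input.isTouched(pointer)) {
        int px = Gdx.input.getX(pointer);
        int py = Gdx.input.getY(pointer);
        // handle this finger's screen coordinates
    }
}

The event-driven side is covered by the touchDown/touchDragged/touchUp callbacks of an InputProcessor; the self-contained example in the next section shows them in action.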
Touch Point
To get the correct world position of a touch point or the mouse cursor, the raw screen coordinates must be unprojected with a camera that operates in world space. Below is a self-contained example of doing just that.
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input;
import com.badlogic.gdx.InputProcessor;
import com.badlogic.gdx.backends.lwjgl3.Lwjgl3Application;
import com.badlogic.gdx.backends.lwjgl3.Lwjgl3ApplicationConfiguration;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.glutils.ShapeRenderer;
import com.badlogic.gdx.math.Vector3;
import com.badlogic.gdx.utils.viewport.ExtendViewport;

public class SimplerTouchTest extends ApplicationAdapter implements InputProcessor {
    // we will use 32px/unit in world
    public final static float SCALE = 32f;
    public final static float INV_SCALE = 1.f / SCALE;
    // this is our "target" resolution; note that the window can be any size, it is not bound to this one
    public final static float VP_WIDTH = 1280 * INV_SCALE;
    public final static float VP_HEIGHT = 720 * INV_SCALE;

    private OrthographicCamera camera;
    private ExtendViewport viewport;
    private ShapeRenderer shapes;

    @Override public void create () {
        camera = new OrthographicCamera();
        // pick a viewport that suits your needs, ExtendViewport is a good start
        viewport = new ExtendViewport(VP_WIDTH, VP_HEIGHT, camera);
        // ShapeRenderer so we can see our touch point
        shapes = new ShapeRenderer();
        Gdx.input.setInputProcessor(this);
    }

    @Override public void render () {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        shapes.setProjectionMatrix(camera.combined);
        shapes.begin(ShapeRenderer.ShapeType.Filled);
        shapes.circle(tp.x, tp.y, 0.25f, 16);
        shapes.end();
    }

    Vector3 tp = new Vector3();
    boolean dragging;

    @Override public boolean mouseMoved (int screenX, int screenY) {
        // we can also handle mouse movement without anything pressed
        // camera.unproject(tp.set(screenX, screenY, 0));
        return false;
    }

    @Override public boolean touchDown (int screenX, int screenY, int pointer, int button) {
        // ignore if it's not the left mouse button or the first touch pointer
        if (button != Input.Buttons.LEFT || pointer > 0) return false;
        camera.unproject(tp.set(screenX, screenY, 0));
        dragging = true;
        return true;
    }

    @Override public boolean touchDragged (int screenX, int screenY, int pointer) {
        if (!dragging) return false;
        camera.unproject(tp.set(screenX, screenY, 0));
        return true;
    }

    @Override public boolean touchUp (int screenX, int screenY, int pointer, int button) {
        if (button != Input.Buttons.LEFT || pointer > 0) return false;
        camera.unproject(tp.set(screenX, screenY, 0));
        dragging = false;
        return true;
    }

    @Override public void resize (int width, int height) {
        // the viewport must be updated for it to work properly
        viewport.update(width, height, true);
    }

    @Override public void dispose () {
        // disposable stuff must be disposed
        shapes.dispose();
    }

    @Override public boolean keyDown (int keycode) {
        return false;
    }

    @Override public boolean keyUp (int keycode) {
        return false;
    }

    @Override public boolean keyTyped (char character) {
        return false;
    }

    @Override public boolean scrolled (int amount) {
        return false;
    }

    public static void main (String[] args) {
        Lwjgl3ApplicationConfiguration config = new Lwjgl3ApplicationConfiguration();
        config.setWindowedMode(1280, 720);
        new Lwjgl3Application(new SimplerTouchTest(), config);
    }
}