
I made a fully functional touch keyboard within my WPF application using both XAML and C#. Currently, every key pressed on the keyboard registers the respective character in the text box. However, if the user touches the TextBox (which is welcomed — I do not want to prevent it) to reposition the cursor earlier in the text to fix an error they typed, and then interacts with the keys again, the requested position does not register and the characters continue to be appended at the end.

For example, the user types (note the | acting as the cursor):

Justin is awesome, bt that is because it is his birthday.|

The user sees the error in the word "but", spelled as "bt", and touches the screen to position the cursor between the b and the t, like so:

Justin is awesome, b|t that is because it is his birthday.

The user then expects that touching U on the keyboard will insert it where the cursor appears to be; unfortunately, it currently ends up like this:

Justin is awesome, bt that is because it is his birthday.U|

If I enable the mouse, the same thing happens (so it may not be exclusive to touch). If I plug in a physical keyboard after touching a position, I can type at the selected position with that keyboard without any problem.

The implemented touch keyboard just won't read the position correctly. Looking at this, the action seems to go nowhere because the cursor position (focus) moves from the TextBox to a Button (one of the keyboard keys).

Here's a snippet of code for anyone to test (obviously I can't provide the full keyboard):

XAML

<StackPanel>
  <TextBox x:Name="txtBox" />
  <Button x:Name="btnQ" Content="q" Click="btnQ_Click" />
  <Button x:Name="btnW" Content="w" Click="btnW_Click" />
</StackPanel>

C#

    private bool numberHitSinceLastOperator = false;

    private void HandleKeyboard(string key)
    {
        // Keep the existing text only if a key has been hit
        // since the last operator; otherwise start fresh.
        string valueSoFar = numberHitSinceLastOperator ? txtBox.Text : "";
        string newValue = valueSoFar + key.ToString();
        txtBox.Text = newValue;
        numberHitSinceLastOperator = true;
    }
    private void btnQ_Click(object sender, RoutedEventArgs e)
    {
        HandleKeyboard("Q");
    }

    private void btnW_Click(object sender, RoutedEventArgs e)
    {
        HandleKeyboard("W");
    }
  • Not really an answer to your question, but you can use the [SurfaceSDK](http://msdn.microsoft.com/en-us/library/ff727815.aspx) (originally designed for the Surface and Pixelsense) for touch-enabled UI components. I used them in a wide range of projects and they work on all different types of touch screens with default WPF. – dsfgsho Jun 12 '14 at 23:56
  • @StevenHouben Thanks for the suggestion! I looked into the SDK and will consider using it if I come across other issues as such. :) – justinternio Jun 16 '14 at 17:45

1 Answer


Your problem is this line:

string newValue = valueSoFar + key.ToString();

You are always appending at the end. To append at the right position, you can do something like this:

int oldCaretIndex = txtBox.CaretIndex;
txtBox.Text = txtBox.Text.Insert(oldCaretIndex, "a");
// Changing Text resets the caret position to 0
txtBox.CaretIndex = oldCaretIndex + 1;
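
Applied to the HandleKeyboard method from the question, the whole thing might look like this (just a sketch; it also drops the numberHitSinceLastOperator clearing logic, which looks like a leftover from calculator code and would wipe the box):

    private void HandleKeyboard(string key)
    {
        // Capture the caret first: assigning txtBox.Text resets it to 0.
        int caret = txtBox.CaretIndex;

        // Insert at the caret instead of appending at the end.
        txtBox.Text = txtBox.Text.Insert(caret, key);

        // Put the caret right after the inserted text.
        txtBox.CaretIndex = caret + key.Length;
    }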

You could also try simulating key presses, see for example here: How can I programmatically generate keypress events in C#?

Or, like the commenter suggested, try the SurfaceSDK (though I haven't used it myself).
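
Side note (an assumption on my part, not tested with your exact setup): by default a WPF Button takes focus when clicked, which is why the caret seems to "leave" the TextBox. Marking the key buttons as non-focusable keeps keyboard focus in the TextBox:

    <!-- Focusable="False" stops the key button from stealing focus
         from the TextBox when it is tapped or clicked. -->
    <Button x:Name="btnQ" Content="q" Focusable="False" Click="btnQ_Click" />

With that in place the caret stays visible in the TextBox while keys are tapped, though you still need the CaretIndex-based insert above to put the character in the right place.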
