
RE: capitalisation .not. Capitalisation

by wrapper posted Oct 29 2014, 6:47

A: Yes, formulas currently capitalize the first word of a sentence. I will look into adding an option to change this.


Thanks for that; it is at the word-comparison level that matching needs to be fuzzier. The "do I capitalise" decision should be processed as an action on that task.

I have thought a lot about which further ability would most enhance the bots. I would also like to say how impressed I am by the progress you have made, especially in giving the bots the ability to parse web pages, learn flexibly, and interact easily with humans. So it is important to be as efficient as possible.

I believe your bots have a good basis, "instinctive action and learning ability", to use as a seed for much more complex interaction and a deeper learning capability, without much further programming. Those skills can be self-learned.

I can see now that you have a great deal of the work done, but it is limited in flexibility and hard coded.

Humans look at a whole word when comparing words, judging "how alike they are".

An additional, learnable ability would be for the bot to have a further parameter it can measure when it cannot identify a word. This would give the bot the closest word it has available: the one that is closest in "the same letters", "letters in the same position", "number of letters", or some other relevant parameter.

From : FTC Discussion
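The closest-word idea above could be sketched roughly as follows. The function names and the simple additive scoring are my own illustration, not anything from Bot Libre (which is Java); the three terms correspond to the three parameters the post names.

```python
# Hypothetical sketch: score each known word against an unrecognised word
# on the three measurable parameters suggested above.
def similarity(unknown, candidate):
    shared = len(set(unknown) & set(candidate))                   # "the same letters"
    positional = sum(a == b for a, b in zip(unknown, candidate))  # "letters in the same position"
    length_gap = abs(len(unknown) - len(candidate))               # "number of letters"
    return shared + positional - length_gap

def closest_word(unknown, vocabulary):
    # Return the available word with the highest similarity score.
    return max(vocabulary, key=lambda w: similarity(unknown, w))
```

For example, `closest_word("optamise", ["optimise", "organise", "optic"])` picks `"optimise"`, since it shares the most letters and letter positions with the unknown word.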

The learned ability could then be applied elsewhere, for instance in spelling-error detection: "optomise" is found, but type "optamise" and it will not be, even though most of the letters are the same!
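As an aside, Python's standard library already approximates this kind of fuzzy lookup: `difflib.get_close_matches` ranks candidates by overall sequence similarity instead of requiring an exact dictionary hit. This is only an illustration of the principle, not how Bot Libre implements anything.

```python
import difflib

# A near-miss spelling is matched to the closest dictionary word
# by letter-sequence similarity rather than exact lookup.
words = ["optimise", "optimal", "organise"]
print(difflib.get_close_matches("optamise", words, n=1))  # ['optimise']
```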

Notice that with this first simple skill the bot outperforms any current spell checker, and can ask for further "functions" to try (test, reduce).

These could be added generically, with the formula (semi-self-learned, or a compiled self-learned algorithm) associated with the parameter.

The bot would learn a weighting for the parameter. In the first case this would be a single global weighting, but it could also be stored in the talk decision tree.
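A minimal sketch of the single global weighting, assuming the parameters from the earlier word-matching example; the names and the simple reinforcement rule here are hypothetical, not Bot Libre's mechanism.

```python
# One global weighting per parameter (the "first case" above).
weights = {"shared": 1.0, "positional": 1.0, "length_gap": 1.0}

def score(features, weights):
    # Weighted combination of the measured parameters.
    return (weights["shared"] * features["shared"]
            + weights["positional"] * features["positional"]
            - weights["length_gap"] * features["length_gap"])

def reinforce(features, weights, good_response, rate=0.1):
    # Nudge each weighting up when the bot's choice was confirmed,
    # down when it was corrected -- the learned weighting.
    sign = 1 if good_response else -1
    for name, value in features.items():
        weights[name] += sign * rate * value
```

Storing a copy of `weights` at each node of the talk decision tree, rather than globally, would give the per-context weighting the post goes on to describe.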

For instance, "a mood" (generically created with the same parameter-handling template) would be associated with the weightings of the parameters at that point. The mood could be reduced during sleep (data reduction) by reducing the number of parameters needed to give the best response to the current goals (again, generically the same).

This would give two major advantages: with just the single word-recognition parameter of the number of letters in the correct position, the amount of training would halve. It would not matter how you hard code the typing afterwards (you are the bot's typewriter; it is a limitation of the system they need a tool to overcome, a self-learned tool).


Id: 481872
Replies: 0