But the thinking I did today about the algorithm has it expanding into something like a neural network. I don't feel like going into the whole algorithm right now, but I'll just say this: the algorithm has 6 parameters (and those 6 parameters might take different values when the possible word list is at different sizes, so there could actually be something like 30 parameters), and a neural network (if I understand neural networks correctly, as I've never actually implemented one) would be a good way to find close-to-optimal values for those parameters. And then, of course, there's look-ahead, like the chess-playing programs use. Combine those two approaches, and we're talking some serious CPU cycles, I think. And lots of iterations in order to find those close-to-optimal parameter values.
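The algorithm itself isn't spelled out here, so just to make the parameter-tuning idea concrete, here's a minimal sketch using a simple hill climber rather than an actual neural network (a crude stand-in for the optimizer role described above). The `evaluate` function is a placeholder I made up; a real version would play many simulated word games with the given parameter values and return something like the negated average number of guesses.

```python
import random

def evaluate(params):
    # Placeholder objective, purely for illustration. A real version
    # would run many games of the word-guessing algorithm with these
    # 6 parameter values and score how well it did.
    target = [0.3, 1.2, -0.5, 0.8, 0.0, 2.0]  # arbitrary "optimum" for the demo
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def hill_climb(n_params=6, iterations=2000, step=0.25, seed=42):
    # Start from a random parameter vector and repeatedly try small
    # Gaussian perturbations, keeping any candidate that scores better.
    rng = random.Random(seed)
    best = [rng.uniform(-3, 3) for _ in range(n_params)]
    best_score = evaluate(best)
    for _ in range(iterations):
        candidate = [p + rng.gauss(0, step) for p in best]
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

params, score = hill_climb()
```

The expensive part in practice would be `evaluate`: each call means replaying a large batch of games, which is where the "serious CPU cycles" come in, especially once look-ahead is layered on top.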

I don't imagine it will come as much of a surprise that I'm currently re-reading *Gödel, Escher, Bach*. I have another Hofstadter book that I haven't read, *Fluid Concepts and Creative Analogies*, and I think I'm going to read that next. And then I'll probably get his *Le Ton beau de Marot*, which he talks about in the preface to the 20th-anniversary edition of *GEB*. And probably his collection *Metamagical Themas* after that. Yes, I tend to get obsessed by an author, but it's nice that this one will also get me thinking more about creative things (programming and writing) instead of just escaping into a story.
