Alex Hernandez (writer)

Alex Hernandez is a Cuban-American science fiction writer based in South Florida. The first of his extensive Cuban family to be born in the United States, Hernandez writes in a genre of his own making, which he calls Transhuman Mambo. According to Hernandez, this neologism follows the popular pattern of coupling a scientific term with a musical form, and describes the combination of his love of science fiction with the Cuban culture of his upbringing. Influenced as a child by the work of Isaac Asimov, Hernandez connected in a personal way with this immigrant whose first language was not English. Discovering the novels of Octavia E. Butler while in college had a profound impact on his writing. Hernandez got his start writing indie webcomics in the early 2000s; he is known for his work on Jenny Everywhere and has an extensive list of creator-owned work, including Eleggua, Thoth Boy, and Children of Mars. While working as an administrator of the Miami Dade College Library, Hernandez has published a number of short stories in science fiction publications, including "A Thing of Soft Bonds," which was included in Near Kin: A Collection of Words and Art Inspired by Octavia E. Butler and nominated for a Pushcart Prize.

Most recently, his story "Caridad" was included in Latin@ Rising: An Anthology of Latin@ Science Fiction and Fantasy (Wings Press, 2017). Hernandez's short stories have appeared in the popular and long-running Man-Kzin Wars series, created by Larry Niven and published by Baen Books. Other stories of his have been published in The Colored Lens and Interstellar Fiction. Hernandez is a third cousin of Orlando Ortega-Medina, the author of Jerusalem Ablaze: Stories of Love and Other Obsessions.

Bibliography

Tooth and Talon
"Bound for the Promised Land," Man-Kzin Wars XIII, Baen Books, 2012.
"At the Gates," Man-Kzin Wars XIII, Baen Books, 2012.
"Beasts on the Shore of Light," The Colored Lens, Issue #3, 2012.
"Murder of Crows," 2012.
"Tread Lightly," Interstellar Fiction, 2013.
"Lions on the Beach," Man-Kzin Wars XIV, Baen Books, 2013.
"A Thing of Soft Bonds," Near Kin: A Collection of Words and Art Inspired by Octavia Estelle Butler, Sybaritic Press, 2014.
"The Properties of Water," Bastion Science Fiction Magazine, Issue #4, 2014.

"Of Radiation and Reunion," Whiteside Review, 2015.
"The Jicotea Princess," Three-lobed Burning Eye, Issue #28, 2016.
"Caridad," Latin@ Rising: An Anthology of Latin@ Science Fiction and Fantasy, Wings Press, 2017.
"Cien Mil Soles," Multiverse – an international anthology of science fiction poetry, The New Curiosity Shop, 2018.

Incremental decision tree

An incremental decision tree algorithm is an online machine learning algorithm that outputs a decision tree. Many decision tree methods, such as C4.5, construct a tree using a complete dataset. Incremental decision tree methods instead allow an existing tree to be updated using only new individual data instances, without re-processing past instances. This may be useful in situations where the entire dataset is not available when the tree is updated, where the original dataset is too large to process, or where the characteristics of the data change over time. Incremental decision trees are applicable to on-line learning, data streams, concept drift, data that can be modeled well using a hierarchical model, and systems where a user-interpretable output is desired.

The following is a short list of incremental decision tree methods, organized by their parent algorithms.

CART is a nonincremental decision tree inducer for both classification and regression problems, developed in the mathematics and statistics communities. CART traces its roots to AID. Incremental CART: Crawford modified CART to incorporate data incrementally.
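The core idea above — updating a model from one instance at a time without revisiting past data — can be sketched with a toy incremental classifier. This is a hypothetical illustration (the class name and structure are invented for this sketch, not Crawford's actual incremental CART): a one-level "tree" that maintains class counts per feature value and updates them in constant time per instance.

```python
from collections import Counter

class IncrementalStump:
    """A toy incremental classifier: a one-level 'tree' that keeps
    class counts per value of a single categorical feature.
    Each new instance updates counts in O(1); past data is never re-read."""

    def __init__(self, feature_index):
        self.feature_index = feature_index
        self.counts = {}  # feature value -> Counter of class labels

    def update(self, instance, label):
        """Incorporate one new (instance, label) pair incrementally."""
        value = instance[self.feature_index]
        self.counts.setdefault(value, Counter())[label] += 1

    def predict(self, instance):
        """Predict the majority class observed for this feature value."""
        value = instance[self.feature_index]
        if value not in self.counts:
            return None  # value never seen in the stream
        return self.counts[value].most_common(1)[0][0]

# Instances arrive as a stream; the model never revisits old data.
stump = IncrementalStump(feature_index=0)
for x, y in [(("sunny",), "no"), (("rainy",), "yes"), (("sunny",), "no")]:
    stump.update(x, y)
print(stump.predict(("sunny",)))  # -> no
```

A batch learner such as C4.5 would instead re-read the full dataset on every update; here each instance is folded into sufficient statistics and then discarded.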

ID3 and C4.5 were developed by Quinlan and have roots in Hunt's Concept Learning System. The ID3 family of tree inducers was developed in the engineering and computer science communities. ID3', suggested by Fisher, was a brute-force method to make ID3 incremental. ID4 could incorporate data incrementally; however, certain concepts were unlearnable, because ID4 discards subtrees when a new test is chosen for a node. ID5 did not discard subtrees, but also did not guarantee that it would produce the same tree as ID3. ID5R outputs the same tree as ID3 for a given dataset regardless of the incremental training order; this was accomplished by recursively updating the tree's subnodes. It did not handle multiclass classification tasks or missing values. ID6MDL is an extended version of the ID3 and ID5R algorithms. ITI is an efficient method for incrementally inducing decision trees; the same tree is produced for a dataset regardless of the data's presentation order, and regardless of whether the tree is induced incrementally or non-incrementally.
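What makes ID5R-style restructuring possible is that the statistics behind ID3's split test — class counts overall and per attribute value — can be maintained incrementally, so information gain can be recomputed at any time without revisiting past instances. A minimal sketch of such incrementally maintained gain statistics (names are illustrative, not from any of the systems above):

```python
import math
from collections import defaultdict

def entropy(counts):
    """Shannon entropy of a dict mapping class label -> count."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c)

class GainStats:
    """Incrementally maintained sufficient statistics for ID3-style
    information gain on one categorical attribute."""

    def __init__(self):
        self.class_counts = defaultdict(int)                       # overall
        self.value_class_counts = defaultdict(lambda: defaultdict(int))

    def update(self, attr_value, label):
        """Fold in one instance; O(1) per update."""
        self.class_counts[label] += 1
        self.value_class_counts[attr_value][label] += 1

    def gain(self):
        """Information gain computed from the maintained counts alone."""
        total = sum(self.class_counts.values())
        conditional = sum(sum(vc.values()) / total * entropy(vc)
                          for vc in self.value_class_counts.values())
        return entropy(self.class_counts) - conditional

stats = GainStats()
for v, y in [("a", 1), ("a", 1), ("b", 0), ("b", 0)]:
    stats.update(v, y)
print(round(stats.gain(), 3))  # attribute separates classes perfectly -> 1.0
```

Because each node can hold such counts, a tree can re-evaluate which test is best after every instance and restructure (as ID5R and ITI do) rather than rebuild from scratch.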

It can accommodate numeric variables, multiclass tasks, and missing values. Code is available on the web. Note: ID6NB is not incremental.

Several incremental concept learning systems did not build decision trees, but predated and influenced the development of the earliest incremental decision tree learners, notably ID4. Notable among these was Schlimmer and Granger's STAGGER, which learned disjunctive concepts incrementally and was developed to examine concepts that change over time. Prior to STAGGER, Michalski and Larson investigated an incremental variant of AQ, a supervised system for learning concepts in disjunctive normal form. Experience with these earlier systems and others, including incremental tree-structured unsupervised learning, contributed to a conceptual framework for evaluating incremental decision tree learners, and incremental concept learning generally, along four dimensions that reflect the inherent tradeoffs between learning cost and quality: (1) the cost of a knowledge base update, (2) the number of observations required to converge on a knowledge base with given characteristics, (3) the total effort the system exerts, and (4) the quality of the final knowledge base.

Some of the historical context in which incremental decision tree learners emerged is given in Fisher and Schlimmer, which also expands the four-factor framework used to evaluate and design incremental learning systems.

The Very Fast Decision Tree (VFDT) learner reduces training time for large incremental datasets by subsampling the incoming data stream. CVFDT can adapt to concept drift by using a sliding window on incoming data; old data outside the window is forgotten. VFDTc extends VFDT for continuous data, concept drift, and the application of Naive Bayes classifiers in the leaves. VFML is a toolkit developed by the creators of VFDT and CVFDT, and is available on the web.

The Extremely Fast Decision Tree (EFDT) learner is statistically more powerful than VFDT, allowing it to learn more detailed trees from less data. It differs from VFDT in the method for deciding when to split: VFDT waits until it is confident that the best available branch is better than the second-best alternative, whereas EFDT splits as soon as it is confident that the best available branch is better than the current alternative — and the current alternative is no branch at all.
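Both split rules rest on the Hoeffding bound, which limits how far an observed mean over n samples can stray from the true mean. The two decision rules described above can be sketched as follows (function and parameter names are illustrative, not taken from the VFDT or EFDT papers):

```python
import math

def hoeffding_bound(value_range, delta, n):
    """With probability 1 - delta, the observed mean of n samples of a
    variable with range `value_range` is within epsilon of the true mean."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def vfdt_should_split(gain_best, gain_second_best, value_range, delta, n):
    """VFDT waits: split only once the best candidate's gain exceeds the
    SECOND-best candidate's gain by more than the Hoeffding bound."""
    return gain_best - gain_second_best > hoeffding_bound(value_range, delta, n)

def efdt_should_split(gain_best, value_range, delta, n):
    """EFDT is eager: split once the best candidate's gain exceeds the
    current alternative -- no split at all, i.e. gain 0 -- by the bound."""
    return gain_best - 0.0 > hoeffding_bound(value_range, delta, n)

# After 1000 instances, best gain 0.12 vs. second-best gain 0.10:
eps = hoeffding_bound(1.0, 1e-7, 1000)                  # ~0.09
print(vfdt_should_split(0.12, 0.10, 1.0, 1e-7, 1000))   # False: 0.02 < eps
print(efdt_should_split(0.12, 1.0, 1e-7, 1000))         # True:  0.12 > eps
```

The example shows the practical difference: with two similarly good candidate splits, VFDT keeps waiting because their gains are hard to separate, while EFDT splits immediately because any split clearly beats no split.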

This allows EFDT to insert branches much sooner than VFDT; during incremental learning, it means EFDT can deploy useful trees much earlier. However, the eager branch selection method increases the likelihood of selecting a suboptimal branch. In consequence, EFDT keeps monitoring the performance of all branches and will replace a branch as soon as it is confident there is a better alternative.

OLIN and IOLIN are incremental learners based on the Info-Fuzzy Network (IFN).

See also: Concept drift, Decision tree, Machine learning, Online machine learning.

External links: ITI code; VFML code.