COS402 Program P6 results for dna sorted by error


Author | Description | Date submitted | % Error rate
George and Katie | Random Forests implemented using vanilla Decision Trees and customizable depth, tree size, and bootstrap size. | Thu Jan 9 20:27:29 2014 | 2.3
S1 | Random forests with decision trees (500 trees). | Sat Jan 11 20:12:48 2014 | 2.4
Aaron Doll | This is an implementation of the random forest algorithm | Fri Jan 10 14:50:20 2014 | 2.5
Aaron Doll | This is an implementation of the random forests with m=numAttrs (effectively bagging), 400 trees | Fri Jan 10 22:58:00 2014 | 2.6
skarp | AdaBoost with decision trees as the weak learner (chosen to minimize the entropy, where each tree is restricted to a maximum depth of 5) and 200 rounds of boosting. | Tue Jan 14 12:38:06 2014 | 2.7
Aaron Doll | This is an implementation of the random forests with m=1, 400 trees | Fri Jan 10 22:18:09 2014 | 2.8
Kiwis | AdaBoost with decision trees. Number of iterations: 200. Max depth for tree: 6. | Sun Jan 12 15:46:24 2014 | 2.8
Tauriel | RandomForest w/ DecisionTrees pruned at significance level 0.95 | Sun Jan 12 15:51:48 2014 | 2.8
anon5 | An implementation of the AdaBoost algorithm using decision trees with pruning as the learner with 200 rounds of boosting | Fri Jan 3 20:00:03 2014 | 2.9
Andrew Werner | AdaBoost using vanilla decision trees as the weak learner | Sat Jan 4 15:01:34 2014 | 2.9
Tauriel | RandomForest w/ DecisionTrees | Sat Jan 4 16:30:35 2014 | 2.9
Janie Gu | AdaBoost algorithm with decision trees as the weak learner (with a random subset of training examples selected each round by resampling). | Mon Jan 6 15:26:52 2014 | 2.9
skarp | AdaBoost with decision trees as the weak learner (chosen to minimize the entropy, where each tree is restricted to a maximum depth of 4) and 200 rounds of boosting. | Tue Jan 14 12:29:29 2014 | 2.9
tenrburrito | AdaBoost algorithm with Decision Trees as weak learning algorithm | Tue Jan 14 16:38:21 2014 | 2.9
S1 | An implementation of AdaBoost, with pruned decision trees as the weak learner for the algorithm and 500 rounds of boosting. | Mon Jan 13 16:46:26 2014 | 3.0
anon5 | An implementation of the AdaBoost algorithm using vanilla decision trees as the learner with 200 rounds of boosting | Fri Jan 3 20:15:31 2014 | 3.1
Tiny Wings | AdaBoost with decision tree algorithm as weak learner (maximum depth of decision trees = 5, chi-square pruning significance level = 0.01, # of AdaBoost rounds = 200) | Mon Jan 6 05:07:34 2014 | 3.1
Tauriel | RandomForest w/ DecisionTrees | Sun Jan 12 15:23:34 2014 | 3.1
dericc, sigatapu | 200-iteration AdaBoost with Decision Trees | Mon Jan 13 16:45:48 2014 | 3.1
NY | Random forest with 500 iterations | Wed Jan 8 16:53:41 2014 | 3.2
Tauriel | RandomForest w/ DecisionTrees | Sun Jan 12 15:40:12 2014 | 3.2
anon | AdaBoost (using shallow binary decision trees as weak learner) | Sun Jan 5 12:06:02 2014 | 3.3
SkyNet | 1000-iteration AdaBoost with Decision Stump | Thu Jan 9 21:22:55 2014 | 3.3
sm | AdaBoost, using pruned Decision Trees as the weak learner. | Thu Jan 9 19:13:02 2014 | 3.4
Kiwis | AdaBoost with decision trees. Number of iterations: 100. Max depth for tree: 4. | Sun Jan 12 16:18:27 2014 | 3.4
sm | AdaBoost, using pruned Decision Trees as the weak learner. | Mon Jan 13 15:35:01 2014 | 3.7
Mr. Blobby | AdaBoost (200 rounds) with decision trees (depth limit of 5) | Sun Jan 12 06:13:26 2014 | 3.8
Ravi Tandon | An implementation of the bootstrap aggregation algorithm. | Mon Dec 23 18:40:45 2013 | 4.0
Aaron Doll | Decision tree with reduced error pruning. | Thu Dec 26 16:45:49 2013 | 4.2
David Hammer | bagging (using binary decision trees) | Sun Jan 5 12:29:43 2014 | 4.2
Jordan | Adaboost to create a new feature space, then KNN | Sun Jan 5 21:17:16 2014 | 4.2
anon | bagging (using binary decision trees) | Sat Jan 4 17:36:41 2014 | 4.4
bfang | Bagging with decision trees | Wed Jan 1 11:30:09 2014 | 4.8
ebp and Wafflepocalypse | Adaboost on random forests of 30 trees, sampling .65 of the weighted training data with replacement for each hypothesis, 150 rounds of boosting. | Sat Jan 11 02:46:53 2014 | 4.9
NY | Bagged Decision Trees with 500 trees | Wed Jan 8 16:58:46 2014 | 5.0
George and Katie | Random Forests implemented using vanilla Decision Trees and customizable depth, tree size, and bootstrap size. | Wed Jan 8 23:01:13 2014 | 5.1
ebp and Wafflepocalypse | Adaboost on random forests of 30 trees, sampling .65 of the weighted training data with replacement for each hypothesis, 150 rounds of boosting. | Mon Jan 13 17:59:25 2014 | 5.3
LK | Bagging with AdaBoost that uses decision stumps | Tue Jan 14 12:26:28 2014 | 5.8
CC | AdaBoost with Neural Networks as the learner. Uses the fraction of the data with the highest weights to make the hypothesis on a given round | Tue Jan 14 11:13:03 2014 | 5.9
Andra Constantinescu and Bar Shabtai | AdaBoost on a single layer neural network. The neural classifier takes binary input and loops through all training examples to update the weights of each attribute. Number of boosting rounds optimized for data set (here 2) | Tue Jan 14 16:23:34 2014 | 5.9
asdf | A single perceptron (using a logistic threshold) with a learning rate of 0.001 and 100 epochs of training. | Thu Jan 9 23:08:08 2014 | 6.0
Boar Ciphers | Implements a single-layer neural network with 100 epochs and a 0.001 learning rate | Tue Jan 14 10:53:40 2014 | 6.0
JS | Single-layer Neural Network, 100 epochs, Learning Rate = 0.001 | Tue Jan 14 15:33:47 2014 | 6.0
LK | AdaBoost using decision stump | Fri Jan 3 04:31:18 2014 | 6.1
ebp | Adaboost with decision stumps minimizing smoothed weighted training error, 100 rounds of boosting. | Mon Jan 6 21:02:21 2014 | 6.1
K.L. | AdaBoost run with 1000 iterations. | Tue Jan 7 15:33:41 2014 | 6.1
Kiwis | AdaBoost with decision stumps and 150 iterations. | Thu Jan 9 07:10:52 2014 | 6.1
Sunnanna | single-layer feedforward neural net using a logistic function | Mon Jan 13 16:18:07 2014 | 6.1
Andrew Werner | AdaBoost using vanilla decision trees as the weak learner | Thu Jan 2 22:55:08 2014 | 6.2
LK | Bagging with AdaBoost that uses decision stumps | Mon Jan 6 08:29:25 2014 | 6.3
Jgs | An implementation of AdaBoost with vanilla Decision Trees. Rounds of boosting = 150 | Sun Jan 12 22:08:45 2014 | 6.3
bchou | AdaBoost on Binary Decision Stumps. 150 rounds of Boosting | Fri Jan 3 07:09:15 2014 | 6.5
Anna Ren (aren) and Sunny Xu (ziyangxu) (Sunnanna) | Adaboost using 150 rounds of boosting and decision stumps as a weak learner | Thu Jan 9 22:39:29 2014 | 6.5
Kiwis | AdaBoost with decision stumps and 30 iterations. | Fri Jan 10 07:39:58 2014 | 6.5
Mr. Blobby | AdaBoost (150 rounds) with decision stumps | Fri Jan 10 20:22:29 2014 | 6.5
Green Gmoney Choi | This is an implementation of the AdaBoost algorithm with decision stumps. | Sun Jan 12 15:10:36 2014 | 6.5
Fanny | A voted perceptron algorithm (epoch = 10) | Mon Jan 13 20:44:10 2014 | 6.5
Rocky | AdaBoost algorithm with the decision stumps as the weak learner, T=150 | Mon Jan 13 21:17:55 2014 | 6.5
R.A.B. | Adaboost on decision stumps, 150 rounds | Tue Jan 14 03:15:28 2014 | 6.5
Tiny Wings | Decision tree with chi-square pruning (pruning significance level = 0.01) | Mon Jan 6 05:05:37 2014 | 6.6
Jgs | An implementation of AdaBoost with vanilla Decision Trees. Rounds of boosting = 250 | Sun Jan 12 20:27:00 2014 | 6.6
sabard | A decision tree learning algorithm with chi squared pruning (5%). | Tue Jan 14 15:32:46 2014 | 6.6
Jameh | A learning algorithm using Adaboost along with decision stumps to determine a classifier to use in future test cases. Given a BinaryDataSet and number of rounds for boosting. | Tue Jan 7 22:52:25 2014 | 6.7
George and Katie | A simple implementation of decision trees as per R&N. | Wed Jan 8 21:45:23 2014 | 6.7
BPM | I implemented AdaBoost with binary decision stumps and 100 rounds of boosting. | Wed Jan 8 22:42:32 2014 | 6.7
SAJE | ADABOOST | Thu Jan 9 15:27:21 2014 | 6.7
John Whelchel | Basic implementation of AdaBoost using decision stumps as weak learners and 100 rounds of boosting. | Mon Jan 13 20:42:40 2014 | 6.7
skarp | AdaBoost with decision stumps as the weak learner (chosen to minimize the weighted training error) and 100 rounds of boosting. | Tue Jan 14 12:12:08 2014 | 6.7
hb | AdaBoost, basic decision stumps | Tue Jan 14 15:14:42 2014 | 6.7
dlackey | This is an implementation of AdaBoost that uses 175 rounds of boosting. The weak learning algorithm used is a decision stump that directly minimizes the weighted training error. | Wed Jan 8 14:10:54 2014 | 6.9
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Mon Jan 13 21:22:25 2014 | 6.9
CC | AdaBoost with Decision Stump for 50 rounds. | Mon Jan 13 21:31:50 2014 | 6.9
EC | Neural Net | Tue Jan 7 18:40:44 2014 | 7.0
Tauriel | AdaBoost w/ DecisionTree | Sun Jan 12 14:11:25 2014 | 7.1
skarp | AdaBoost with decision stumps as the weak learner (chosen to minimize the weighted training error) and 200 rounds of boosting. | Sat Dec 28 16:13:58 2013 | 7.2
CTTT | Adaboost + Decision Stumps (200 rounds). | Mon Jan 6 20:44:55 2014 | 7.2
Mickey Mouse | An implementation of AdaBoost with Decision Stump as the weak learner and 200 rounds of boosting | Tue Jan 7 15:44:27 2014 | 7.2
lolz | Adaboost with decision stumps as the weak learner algorithm (k = 200) | Thu Jan 9 00:39:23 2014 | 7.2
SAJE | ADABOOST | Thu Jan 9 15:32:14 2014 | 7.2
Chuck and Larry | AdaBooooooost!!! using binary decision stumps with 200 rounds of boosting | Thu Jan 9 18:04:16 2014 | 7.2
Charliezsc | Adaboost with Decision Stumps | Thu Jan 9 18:21:10 2014 | 7.2
Wumi | AdaBoost with Decision Stump and 200 boosts | Thu Jan 9 20:52:31 2014 | 7.2
Mr. Blobby | AdaBoost (200 rounds) with decision stumps | Thu Jan 9 23:32:26 2014 | 7.2
jabreezy | Adaboost with decision stumps (boosted 200 rounds) | Thu Jan 9 23:49:49 2014 | 7.2
hi | an implementation of AdaBoost woo hoo | Thu Jan 9 23:57:10 2014 | 7.2
Jgs | An implementation of AdaBoost with Decision Stumps that is optimized by only using the best possible decision stump for each attribute. Rounds of boosting = 20 | Sat Jan 11 13:50:02 2014 | 7.2
Marius | AdaBoost using decision stumps as the weak-learning algorithm. It is run for 200 iterations. | Sun Jan 12 14:09:03 2014 | 7.2
Hello! | AdaBoost with Decision Stumps | Sun Jan 12 23:21:29 2014 | 7.2
Epic Harbors | Adaboost with decision stumps as the weak learner and 250 rounds of boosting | Thu Jan 9 15:00:12 2014 | 7.3
Macrame | Adaboost, decision stumps, 250 rounds | Tue Jan 14 01:55:44 2014 | 7.3
Rocky | Bagging algorithm with single layer neural network as the weak learner | Tue Jan 14 15:01:01 2014 | 7.3
NY | Purifies training set for decision tree (pruning alternative) | Wed Jan 8 17:03:23 2014 | 7.4
Mike Honcho | Adaboost implementation | Thu Jan 9 21:47:00 2014 | 7.4
Jgs | An implementation of AdaBoost with Decision Stumps that is optimized by only using the best possible decision stump for each attribute. Rounds of boosting = 150 | Sat Jan 11 13:56:33 2014 | 7.4
bfang | Boosting with decision stumps (100 rounds) | Fri Jan 3 21:10:37 2014 | 7.5
Shaheed Chagani | AdaBoost | Mon Jan 13 21:34:07 2014 | 7.6
CC | AdaBoost with Decision Stump for 25 rounds. | Tue Jan 14 10:51:51 2014 | 7.6
Bob Dondero | Adaboost (200 rounds) with weak learner as a decision tree (max depth 5) and chi-squared pruning (1%) | Fri Jan 10 23:04:11 2014 | 7.7
Solving From Classifier | The Naive Bayes algorithm using a binary representation as opposed to a discrete representation. | Sat Jan 11 21:42:29 2014 | 7.7
Solving From Classifier | The Naive Bayes algorithm using a binary representation at times and a discrete representation at other times. | Sat Jan 11 21:58:17 2014 | 7.7
sabard | A decision tree learning algorithm with chi squared pruning. | Tue Jan 14 05:27:30 2014 | 7.7
skarp | AdaBoost with decision stumps as the weak learner (chosen to minimize the weighted training error) and 300 rounds of boosting. | Tue Jan 14 12:07:17 2014 | 7.7
Jgs | An implementation of AdaBoost with Decision Stumps that is optimized by only using the best possible decision stump for each attribute. Rounds of boosting = 250 | Thu Jan 9 23:27:29 2014 | 7.9
Andra Constantinescu and Bar Shabtai | AdaBoost with decision stump as the weak learner. Number of iterations of AdaBoost optimized per example. | Tue Jan 14 02:55:45 2014 | 7.9
me | Adaboost using decision stumps and 400 rounds of boosting. | Thu Jan 9 23:18:51 2014 | 8.0
bfang | Single layer neural network, 80 epochs, alpha=0.01 | Fri Jan 10 12:43:10 2014 | 8.0
haoyu | Random Forest with Decision Tree | Fri Dec 27 00:48:21 2013 | 8.1
bfang | Boosting with decision stumps and early stopping | Sun Dec 29 23:30:13 2013 | 8.1
qshen | An implementation of AdaBoost that uses a weak learner that chooses the decision stump that minimizes the weighted training error and is iterated 500 times. | Mon Dec 30 13:58:28 2013 | 8.1
anon5 | An implementation of the AdaBoost algorithm using decision stumps as the learner with 200 rounds of boosting | Thu Jan 2 15:28:13 2014 | 8.1
K.L. | Single layer neural net, 100 training rounds, learning rate = .01. | Tue Jan 7 15:36:37 2014 | 8.1
S1 | An implementation of AdaBoost, with decision stumps as the weak learner for the algorithm and 500 rounds of boosting. | Thu Jan 9 15:22:39 2014 | 8.1
SAJE | ADABOOST | Thu Jan 9 15:35:47 2014 | 8.1
Rocky | AdaBoost algorithm with the decision stumps as the weak learner | Thu Jan 9 20:49:05 2014 | 8.1
0108 | Adaboost with decision stump as weak learner | Sun Jan 12 14:22:07 2014 | 8.1
Andra Constantinescu and Bar Shabtai | Vanilla single layer neural network algorithm. Takes binary input and loops through all training examples to update the weights of each attribute. Alpha = 0.1. Very nice, I like! | Thu Jan 9 21:00:47 2014 | 8.2
Mike Honcho 500 | Adaboost implementation | Tue Jan 14 12:28:11 2014 | 8.3
Wafflepocalypse | A random forest classifier with 1001 trees. | Thu Jan 9 04:53:45 2014 | 8.4
Fanny | An ensemble learning algorithm that consists of AdaBoost using decision stumps as weak learner. | Sun Jan 12 04:25:12 2014 | 8.5
Supahaka | AdaBoost with Decision Stumps with 500 rounds of boosting. | Tue Jan 14 01:05:08 2014 | 8.5
Andra Constantinescu and Bar Shabtai | Vanilla single layer neural network algorithm. Takes binary input and loops through all training examples to update the weights of each attribute. Alpha and epochs optimized for each dataset. | Tue Jan 14 02:34:41 2014 | 8.5
JS | Single-layer Neural Network, 200 epochs, Learning Rate = 0.01 | Thu Jan 9 15:36:09 2014 | 8.6
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Mon Jan 13 07:30:13 2014 | 8.6
bcfour | A Naive Bayes approach to classification. | Sat Jan 4 22:45:40 2014 | 8.8
Solving From Classifier | The Naive Bayes algorithm executes the maximum-likelihood parameter learning problem and uses the learned parameters (obtained from observed attribute values) to find the maximum-likelihood naive Bayes hypothesis. | Tue Jan 7 13:55:18 2014 | 8.8
Mickey Mouse | An implementation of Naive Bayes | Tue Jan 7 15:50:35 2014 | 8.8
Caligula | An implementation of AdaBoost with decision stumps and 800 rounds of boosting. | Wed Jan 8 19:46:23 2014 | 8.8
bcfour, jkwok | Naive Bayes with standard Laplacian correction | Thu Jan 9 13:03:06 2014 | 8.8
dmmckenn_pthorpe | Implements Naive Bayes using discretization as opposed to continuous values. | Thu Jan 9 21:21:27 2014 | 8.8
Hello! | Naive Bayes | Thu Jan 9 23:05:22 2014 | 8.8
akdote | Naive Bayes Algorithm | Thu Jan 9 23:46:25 2014 | 8.8
Sunnanna | Naive Bayes algorithm using maximum likelihood estimator | Mon Jan 13 18:54:30 2014 | 8.8
vvspr | An implementation of the Naive Bayes Algorithm | Tue Dec 31 13:37:34 2013 | 8.9
weezy | Implements AdaBoost using decision stumps as a weak learner and running for 1000 rounds of boosting. | Wed Jan 8 23:17:50 2014 | 9.0
SkyNet | 1000-iteration AdaBoost with Decision Stump | Thu Jan 9 00:20:25 2014 | 9.0
CC | AdaBoost with Decision Stump for 1000 rounds. | Thu Jan 9 14:40:01 2014 | 9.0
ytterbium | AdaBoost with decision stumps. (1000 rounds) | Thu Jan 9 20:00:48 2014 | 9.0
Cam Porter | A version of the AdaBoost learning algorithm that uses decision stumps as a weak learning base. | Thu Jan 9 22:47:38 2014 | 9.0
dmmckenn_pthorpe | Implements Adaboost with 1,000 rounds of boosting with decision stumps as the weak learner. | Thu Jan 9 23:32:58 2014 | 9.0
Mr. Blobby | AdaBoost (1000 rounds) with decision stumps | Fri Jan 10 20:11:35 2014 | 9.0
R.A.B. | Adaboost on decision stumps, 1000 rounds | Tue Jan 14 02:04:14 2014 | 9.0
anon | vanilla decision tree | Sat Jan 4 14:23:53 2014 | 9.1
akdote | Naive Bayes Classifier | Thu Jan 9 21:05:57 2014 | 9.1
0108 | Adaboost with decision stump as weak learner | Sun Jan 12 13:35:33 2014 | 9.3
Glenn | Backpropagation performed on a neural network with 1 hidden layer for 50 iterations. The learning rate was set to 0.01 and the layers (from input to output) contain [ 241 11 1 ] units, including a bias unit for each non-output layer. | Tue Jan 14 12:55:42 2014 | 9.3
anon5 | An implementation of a decision-tree-learning algorithm with pruning | Fri Jan 3 23:11:54 2014 | 9.5
Andra Constantinescu and Bar Shabtai | Vanilla single layer neural network algorithm. Takes binary input and loops through all training examples to update the weights of each attribute. Alpha and epochs optimized for each dataset. | Tue Jan 14 02:29:11 2014 | 9.6
Jordan | The Adaboost Algorithm with 2000 Decision Stumps | Sun Jan 5 15:21:07 2014 | 9.9
anon5 | An implementation of vanilla decision-tree-learning | Fri Jan 3 23:15:26 2014 | 10.0
CC | AdaBoost with Neural Networks as the learner. Uses the fraction of the data with the highest weights to make the hypothesis on a given round | Mon Jan 13 16:07:10 2014 | 10.0
NY | Decision Tree | Sun Jan 12 15:34:38 2014 | 10.1
bclam | Single-layer Neural Network - 1000 epochs, Learning rate of 0.05 | Wed Jan 8 22:16:33 2014 | 10.2
Glenn | Backpropagation performed on a neural network with 1 hidden layer for 3000 iterations. The learning rate was set to 0.1 and the layers (from input to output) contain [ 241 4 1 ] units, including a bias unit for each non-output layer. | Thu Jan 9 23:20:33 2014 | 10.2
CC | AdaBoost with Decision Stump for 5000 rounds. | Thu Jan 9 14:49:58 2014 | 10.3
CC | AdaBoost with Neural Networks as the learner. Uses the fraction of the data with the highest weights to make the hypothesis on a given round | Tue Jan 14 11:41:07 2014 | 10.3
SAJE | ADABOOST | Thu Jan 9 15:43:16 2014 | 10.4
SAJE | ADABOOST 2 | Thu Jan 9 15:48:58 2014 | 10.4
hb | AdaBoost, KNN as weak learner, k chosen empirically | Tue Jan 14 16:33:16 2014 | 10.4
Chuck and Larry | Perceptron neural network | Thu Jan 9 18:00:04 2014 | 10.5
Dr. Steve Brule (For Your Health) | Neural Network. | Sun Jan 12 19:58:48 2014 | 10.5
Valya | Implements a single layer neural net, much like the algorithm used in W6, with alpha = 0.01, running for 1000 epochs. | Mon Jan 13 22:06:10 2014 | 10.6
Fanny | An ensemble learning algorithm that consists of AdaBoost using decision stumps as weak learner. | Sat Jan 4 06:55:07 2014 | 10.7
Dr. Roberto | Single layer Neural Net run for 100 epochs with a learning value of 0.01 | Thu Jan 9 13:39:04 2014 | 10.9
Katie and George | An implementation of the (voted) perceptron algorithm run for 25 epochs. | Thu Jan 9 20:08:04 2014 | 10.9
Charliezsc | Bagging with Decision Stumps (200 weak learners and 2 percent bootstrap samples) | Thu Jan 9 20:49:51 2014 | 10.9
CAPSLOCK | A mostly vanilla decision tree. Uses some cool data structures though. | Thu Jan 9 22:57:33 2014 | 10.9
Bob Dondero | Adaboost (200 rounds) with weak learner as a decision tree (max depth 5) and chi-squared pruning (1%). | Thu Jan 9 20:54:23 2014 | 11.0
Supahaka | AdaBoost with Decision Stumps with 100000 rounds of boosting. | Mon Jan 13 00:38:32 2014 | 11.0
vluu | An attempt at AdaBoost with Naive Bayes | Thu Dec 26 21:36:21 2013 | 11.1
Katie and George | An implementation of the (voted) perceptron algorithm run for 100 epochs. | Mon Jan 6 21:03:13 2014 | 11.1
Dr. Roberto | ADABoost with 50 rounds of Single Layer Neural Net run for 100 epochs with a varying learning value of around 0.01 | Thu Jan 9 14:59:30 2014 | 11.1
PandaBear | Adaboost on decision stumps, 1000 rounds | Thu Jan 9 18:11:53 2014 | 11.4
B&Y | We use the voted-perceptron algorithm. It runs repeatedly on each training set until it finds a prediction vector which is correct on all examples. We keep track of the survival times for each new prediction vector. These weights help us make a final binary prediction using a weighted majority vote. | Thu Jan 9 23:14:49 2014 | 11.4
B&Y | We use the voted-perceptron algorithm (running at 1000 epochs). It runs repeatedly on each training set until it finds a prediction vector which is correct on all examples. We keep track of the survival times for each new prediction vector. These weights help us make a final binary prediction using a weighted majority vote. | Tue Jan 14 12:25:17 2014 | 11.4
Hello! | AdaBoost with Naive Bayes (200) | Tue Jan 14 14:19:36 2014 | 11.4
Mickey Mouse | An implementation of the Voted Perceptron algorithm with 200 epochs | Wed Jan 8 15:52:53 2014 | 11.5
T.C. | Multi-layered Neural Net, 200 iterations, .1 learning rate | Thu Jan 9 03:05:39 2014 | 11.8
Catherine Wu and Yan Wu | AdaBoost using random sampling and Decision Trees | Wed Jan 8 22:51:13 2014 | 12.1
Glenn | Backpropagation performed on a neural network with 0 hidden layers for 100 iterations. The learning rate was set to 0.001 and the layers (from input to output) contain [ 241 1 ] units, including a bias unit for each non-output layer. | Tue Jan 14 13:06:55 2014 | 12.1
Lil Thug | A simple decision tree algorithm with chi-squared pruning. | Wed Jan 8 15:12:38 2014 | 12.4
Hello! | AdaBoost with Naive Bayes | Tue Jan 7 15:09:21 2014 | 12.5
Dr. Roberto | ADABoost with 100 rounds of Single Layer Neural Net run for 10 epochs with a varying learning value of around 0.01 | Thu Jan 9 14:51:08 2014 | 12.8
Mike Honcho 100 | Adaboost implementation | Tue Jan 14 12:25:16 2014 | 13.3
Shaheed Chagani | Naive Bayes Classifier | Mon Jan 13 14:04:10 2014 | 13.7
Andra Constantinescu and Bar Shabtai | Random Forest with number of trees, N and M optimized for each dataset! | Tue Jan 14 07:12:38 2014 | 14.2
Steve Brule (For Your Health) | Decision Tree Learning algorithm implementation. | Thu Jan 9 17:28:18 2014 | 14.3
Ameera and David | Decision Tree Learning algorithm implementation. | Sun Jan 12 22:04:52 2014 | 14.3
lilt | An implementation of 10-nearest neighbors | Thu Jan 9 00:23:16 2014 | 15.1
finn&jake | KNN, K=10, Euclidean distance for numeric and standardized distance for discrete variables; majority vote for nearest neighbors. | Tue Jan 14 13:22:51 2014 | 15.1
R.A.B. | K nearest neighbors with k = 20 | Thu Jan 9 22:26:05 2014 | 16.0
Linda Zhong | Basic decision tree algorithm implementation, no pruning. | Sun Jan 12 16:29:38 2014 | 16.6
Fanny | A voted perceptron algorithm (epoch = 10) | Sun Jan 12 22:58:47 2014 | 17.7
CTTT | A decision stump weak learner. | Mon Jan 6 22:58:43 2014 | 17.9
K.L. | Decision Stumps | Tue Jan 7 15:29:40 2014 | 17.9
Mickey Mouse | An implementation of the Decision Stump algorithm | Tue Jan 7 15:47:47 2014 | 17.9
Squirtle | An implementation of the vanilla decision stumps classifier | Thu Jan 9 14:27:10 2014 | 17.9
Charliezsc | Decision Stump without boosting | Thu Jan 9 19:19:26 2014 | 17.9
Charliezsc | Bagging with Decision Stumps (200 weak learners and half bootstrap samples) | Thu Jan 9 20:08:14 2014 | 17.9
dmmckenn_pthorpe | Basic Stumps Implementation | Thu Jan 9 21:12:17 2014 | 17.9
corgi | basic decision stump | Thu Jan 9 23:35:40 2014 | 17.9
0108 | Bagging with decision stump as weak learner | Sun Jan 12 14:19:36 2014 | 17.9
PandaBear | Adaboost on decision stumps, 500 rounds | Mon Jan 13 22:01:40 2014 | 17.9
PandaBear | Adaboost on decision stumps, 1000 rounds | Tue Jan 14 11:09:57 2014 | 17.9
God | Implements Basic Decision Stumps and chooses the one which performs the best | Tue Jan 14 14:35:40 2014 | 17.9
hb | KNN with L2 distance, k empirically set after cross validation | Tue Jan 14 14:50:47 2014 | 17.9
Learner | Implementation of Adaboost with decision stumps as the weak learner. | Tue Jan 14 15:17:02 2014 | 17.9
Learner | Implementation of Adaboost with decision stumps as the weak learner. 100 rounds of boosting. | Tue Jan 14 15:33:13 2014 | 17.9
sabard | Decision Stump weak learning algorithm to be used with AdaBoost | Tue Jan 14 15:39:42 2014 | 17.9
Gewang | Predicts the classification label based on the k nearest neighbors | Tue Jan 7 11:43:15 2014 | 18.0
Bob Dondero | A decision tree learning algorithm using information gain and chi-squared pruning. | Thu Jan 9 20:39:27 2014 | 18.2
Gewang | Predicts the classification label based on the k nearest neighbors | Thu Jan 2 13:20:21 2014 | 18.3
Fanny | A voted perceptron algorithm (epoch = 30) | Wed Jan 8 13:14:08 2014 | 18.3
The Whitman Whale | Nearest neighbor classification with 17 neighbors and Manhattan distance | Thu Jan 9 22:48:28 2014 | 18.3
R.A.B. | k-NN with K = 21 and votes weighted by the inverse of distance | Tue Jan 14 00:03:23 2014 | 18.3
Gewang | Predicts the classification label based on the k nearest neighbors | Tue Jan 7 11:37:07 2014 | 18.4
CTTT | Decision Tree Algorithm with Chi-Squared Pre-Pruning | Mon Jan 6 20:48:45 2014 | 18.5
Gewang | Predicts the classification based on the k nearest neighbors during training | Sun Dec 29 11:39:28 2013 | 18.8
bchou | Nearest 7-neighbors | Fri Jan 3 20:02:10 2014 | 19.0
0108 | Adaboost with decision stump as weak learner | Wed Jan 8 18:35:33 2014 | 19.0
Mike Honcho | Adaboost implementation | Tue Jan 14 12:23:10 2014 | 19.1
HH | A very simple learning algorithm that, on each test example, predicts the classification based on the k nearest neighbors during training | Sun Dec 29 11:29:18 2013 | 19.2
mdrjr | This is an implementation of k nearest neighbors. I've played around with both k and the distance function. | Thu Jan 9 20:04:54 2014 | 19.3
Sunnanna | nearest neighbors algorithm with k = 7 | Mon Jan 13 14:49:40 2014 | 19.3
weezy | Implements a k-Nearest Neighbor algorithm with k = 27. | Sat Jan 11 00:38:42 2014 | 20.0
hb | AdaBoost, KNN as weak learner, k chosen empirically | Tue Jan 14 16:02:40 2014 | 20.4
Wu-Tang Dynasty | AdaBoost using random sampling and Nearest Neighbors | Mon Jan 13 22:03:51 2014 | 21.3
Rocky | Nearest Neighbors with weighted vote (weight is inversely proportional to distance), Manhattan distance for normalized attributes, linear scan of all examples to find the K nearest neighbors (not good for very large training sets) | Thu Jan 9 10:11:26 2014 | 22.5
CookieMonster | This is a bagging algorithm which uses a nearest neighbor algorithm as its weak classifier. | Mon Jan 6 20:47:22 2014 | 22.8
CookieMonster | This is a nearest-neighbor classifier which takes a majority vote from the k nearest points in feature space using Euclidean Distance. | Thu Jan 9 22:57:25 2014 | 23.1
bclam | Single-layer Neural Network - 10000 epochs, Learning rate of 0.05 | Tue Jan 14 07:20:25 2014 | 23.4
ASapp | Nearest Neighbor Algorithm with k = 5. Normalizes using mean and standard deviation of each attribute. | Sat Jan 11 01:33:38 2014 | 23.6
hb | KNN with L2 distance, k empirically set after cross validation | Thu Jan 9 22:52:35 2014 | 23.7
bclam | Single-layer Neural Network - 2000 epochs, Learning rate of 0.01 | Tue Jan 14 07:12:01 2014 | 23.7
CookieMonster | This is a bagging algorithm which uses a nearest neighbor algorithm as its weak classifier. | Thu Jan 9 23:13:29 2014 | 23.9
Khoa | An algorithm that classifies. | Tue Jan 14 14:16:06 2014 | 23.9
Jake Barnes | Single layer artificial neural network with 125 rounds of training. Learning rate is 0.1 | Wed Jan 8 16:47:50 2014 | 24.1
Dr. Steve Brule (For Your Health) | Neural Network trained for 100 epochs. | Tue Jan 14 15:39:44 2014 | 24.6
AFC | An (initial) implementation of K nearest neighbors with K = sqrt(number of training samples). | Thu Jan 2 16:52:21 2014 | 24.8
corgi2.0 | AdaBoost 150 w/ basic decision stumps | Sat Jan 11 23:14:44 2014 | 25.9
Nihar the God | Uses Adaboost with decision stumps as weak learners and then uses 150 rounds of boosting | Tue Jan 14 15:07:58 2014 | 25.9
weezy | Implements a k-Nearest Neighbor algorithm with k = 15. | Thu Jan 9 20:02:27 2014 | 26.0
T.C. | Multi-layered Neural Net, 100 iterations, .1 learning rate | Wed Jan 8 04:38:21 2014 | 26.9
AFC | An implementation of K nearest neighbors with empirically optimized K values. | Thu Jan 2 20:59:59 2014 | 27.8
Joshua A. Zimmer | A working (hopefully) attempt at a learning algorithm that uses weighting of the training examples via decision stumps to predict the classification of the test examples. | Mon Jan 20 16:17:10 2014 | 28.3
Joshua A. Zimmer | A learning algorithm that uses weighting of the training examples via decision stumps to predict the classification of the test examples. | Tue Jan 14 16:01:11 2014 | 28.6
ASapp | Nearest Neighbor Algorithm with k = 5. Normalizes using mean and standard deviation of each attribute. | Thu Jan 9 23:48:40 2014 | 29.5
Nihar the God | Uses Adaboost with decision stumps as weak learners and then uses 200 rounds of boosting | Tue Jan 14 15:12:20 2014 | 29.5
Jake Barnes | Multiple layer artificial neural network (5 hidden nodes) with 125 rounds of training. Learning rate is 0.1 | Mon Jan 13 17:03:22 2014 | 29.8
Stephen McDonald | A K-nearest neighbours algorithm that predicts a test example by taking a majority vote of the k nearest neighbours, as measured by Manhattan distance (k is set to 1 for this trial). Additionally, this algorithm first converts the attribute types to numeric and normalizes each attribute to have zero mean and unit variance. | Wed Jan 8 18:29:05 2014 | 30.5
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Sat Jan 11 18:34:43 2014 | 31.5
hp | Bagging Decision Stumps | Thu Jan 2 13:59:51 2014 | 32.8
L.M. | K-nearest | Thu Jan 9 03:40:10 2014 | 32.9
Shaheed Chagani | Naive Bayes Classifier | Wed Dec 18 07:43:08 2013 | 33.9
Wu-Tang Dynasty | AdaBoost using random sampling and Decision Trees | Sun Jan 12 22:48:11 2014 | 36.3
Valya | Implements a single layer neural net, much like the algorithm used in W6, with alpha = 0.1 | Sun Jan 12 20:44:34 2014 | 40.4
Valya | Implements neural nets, much like the algorithm used in W6, with alpha = 0.1 | Thu Jan 9 14:20:30 2014 | 41.0
John Whelchel | Basic implementation of AdaBoost using decision stumps as weak learners and 100 rounds of boosting. | Sun Jan 12 11:32:16 2014 | 41.0
John Whelchel | Basic implementation of AdaBoost using decision stumps as weak learners and 150 rounds of boosting. | Sun Jan 12 20:29:33 2014 | 41.0
Shaheed Chagani | Naive Bayes Classifier | Sun Jan 12 22:44:50 2014 | 42.2
tenrburrito | AdaBoost algorithm with Decision Trees as weak learning algorithm | Thu Jan 9 15:24:10 2014 | 42.5
Mike Honcho 10 | Adaboost implementation | Tue Jan 14 12:26:40 2014 | 45.2
corgi3.0 | decision tree, discrete attribute splitting | Tue Jan 14 12:55:01 2014 | 46.4
corgi3.0 | basic decision tree | Tue Jan 14 02:39:38 2014 | 47.3
corgi4.0 | decision tree with chi squared pruning | Tue Jan 14 04:16:19 2014 | 47.3
corgi4.0 | decision tree with chi squared pruning | Tue Jan 14 12:52:06 2014 | 47.3
corgi5.0 | decision tree with chi squared pruning | Tue Jan 14 14:26:38 2014 | 47.3
finn&jake | KNN, K=40, Euclidean distance for numeric and standardized distance for discrete variables; majority vote for nearest neighbors. | Thu Jan 9 14:13:25 2014 | 48.4
Igor Zabukovec | SVM | Thu Jan 9 15:14:08 2014 | 48.6
Aaron Doll | This is an implementation of the random forest algorithm | Thu Jan 9 18:17:47 2014 | 48.6
Guessing | Minimally outputs a result by applying a random function. | Sat Jan 11 15:26:17 2014 | 48.6
God | Implements naive Bayes algorithm. | Tue Jan 14 00:19:12 2014 | 48.6
Linda Zhong | Basic decision tree algorithm implementation, no pruning. | Fri Jan 10 00:47:46 2014 | 50.0
null | null | Thu Jan 2 21:29:50 2014 | 51.4
Learner | Implementation of Adaboost with decision stumps as the weak learner. | Fri Jan 10 02:38:34 2014 | 51.4
ASapp | Nearest Neighbor Algorithm with k = 5. Normalizes using mean and standard deviation of each attribute. | Sat Jan 11 01:24:41 2014 | 51.4
Joshua A. Zimmer | A learning algorithm that uses weighting of the training examples via decision stumps to predict the classification of the test examples. | Fri Jan 10 04:11:05 2014 | 52.4
Khoa | An algorithm based on decision stumps | Thu Jan 9 23:57:24 2014 | 53.0
lilt | a decision stump implementation | Mon Jan 13 15:03:39 2014 | 61.2
Estranged Egomaniac | AdaBoost with decision stumps. 250 rounds of boosting. | Sun Jan 12 13:54:21 2014 | 82.1
sabard | A decision tree learning algorithm. | Thu Jan 9 23:04:47 2014 | 83.4
Solving From Classifier | The Naive Bayes algorithm executes the maximum-likelihood parameter learning problem and uses the learned parameters (obtained from observed attribute values) to find the maximum-likelihood naive Bayes hypothesis. | Tue Jan 7 13:49:22 2014 | 91.2
Marius | AdaBoost using decision stumps as the weak-learning algorithm. It is run for 200 iterations. | Thu Jan 9 20:13:44 2014 | 92.8

Table generated: Mon Jan 20 16:17:21 2014
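
For reference: most of the strongest entries above pair AdaBoost with decision stumps or shallow decision trees, choosing each round's stump to minimize the weighted training error. The Java sketch below is a minimal illustration of that recipe, not any particular submission. It assumes binary 0/1 attributes and +/-1 labels; the array-based interface (class AdaBoostStumps, arrays X and y) is hypothetical and does not reproduce the course's BinaryDataSet API.

import java.util.Arrays;

/** Minimal AdaBoost with decision stumps over binary 0/1 attributes
    and +/-1 labels. Illustrative sketch only; interface is hypothetical. */
public class AdaBoostStumps {
    // One weak hypothesis: an attribute test plus its AdaBoost vote weight.
    private static class Stump {
        int attr;            // attribute tested
        int polarity;        // +1: predict +1 when the attribute is 1
        double alpha;        // vote weight assigned by AdaBoost
        int predict(int[] x) { return x[attr] == 1 ? polarity : -polarity; }
    }

    private final Stump[] stumps;

    public AdaBoostStumps(int[][] X, int[] y, int rounds) {
        int n = X.length, d = X[0].length;
        double[] w = new double[n];
        Arrays.fill(w, 1.0 / n);                     // uniform initial example weights
        stumps = new Stump[rounds];
        for (int t = 0; t < rounds; t++) {
            Stump best = new Stump();
            double bestErr = Double.MAX_VALUE;
            // Pick the (attribute, polarity) stump minimizing weighted training error.
            for (int a = 0; a < d; a++) {
                for (int pol = -1; pol <= 1; pol += 2) {
                    double err = 0.0;
                    for (int i = 0; i < n; i++) {
                        int pred = X[i][a] == 1 ? pol : -pol;
                        if (pred != y[i]) err += w[i];
                    }
                    if (err < bestErr) { bestErr = err; best.attr = a; best.polarity = pol; }
                }
            }
            // Standard vote weight; clamping guards the log when err is 0 or 1.
            double err = Math.min(Math.max(bestErr, 1e-10), 1 - 1e-10);
            best.alpha = 0.5 * Math.log((1 - err) / err);
            stumps[t] = best;
            // Reweight: misclassified examples gain weight, then renormalize.
            double sum = 0.0;
            for (int i = 0; i < n; i++) {
                w[i] *= Math.exp(-best.alpha * y[i] * best.predict(X[i]));
                sum += w[i];
            }
            for (int i = 0; i < n; i++) w[i] /= sum;
        }
    }

    // Final hypothesis: sign of the alpha-weighted vote of all stumps.
    public int predict(int[] x) {
        double vote = 0.0;
        for (Stump s : stumps) vote += s.alpha * s.predict(x);
        return vote >= 0 ? 1 : -1;
    }
}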