Human Generated Data

Title

Photo by Kip Ruhl

Date

2001

People

Artist: Larry Stark, American, born 1941

Classification

Prints

Machine Generated Data

Tags

Amazon

Art 98.2%
Human 91.1%
Person 91.1%
Painting 79.2%
Photo 62.7%
Photography 62.7%
Face 62%

Clarifai

paper 97.5%
blank 96.9%
bill 96.8%
portrait 96.2%
message 95.9%
old 95.7%
empty 95.3%
people 95.1%
one 92.6%
picture frame 92.6%
painting 91.1%
man 90.5%
desktop 90.2%
display 89.8%
adult 89.7%
card 89.5%
vintage 89.4%
art 88.9%
billboard 87.5%
blank space 84.2%

Imagga

baby 35.1%
orangutan 27.5%
primate 22.8%
ape 22.8%
wildlife 21.4%
fetus 20.7%
vertebrate 20.4%
brown 17.6%
monkey 16.9%
animals 14.8%
wild 13.9%
mammal 12.6%
eyes 12%
gold 11.5%
face 11.4%
close 10.8%
endangered 10.8%
yellow 10.6%
travel 10.5%
statue 10.4%
one 10.4%
bear 10.3%
chordate 10.2%
fur 10.2%
water 10%
tree 10%
old 9.7%
zoo 9.6%
carving 9.6%
sculpture 9.6%
device 9.5%
conservation 9.5%
culture 9.4%
natural 9.4%
black 9%
religion 9%
nut and bolt 8.9%
art 8.8%
closeup 8.7%
jungle 8.7%
ancient 8.6%
cute 8.6%
holiday 8.6%
golden 8.6%
head 8.4%
wood 8.3%
toy 8.3%
eye 8%
primates 7.9%
slow 7.8%
orange 7.7%
container 7.6%
fastener 7.5%
vintage 7.4%
mongoose 7.4%
symbol 7.4%
single 7.4%
support 7.2%
wet 7.1%

Google

Microsoft

gallery 74.5%
room 42.3%
picture frame 37%
screenshot 28%
art 28%
animal 7.9%
toy 5%
fish 4.3%
aquarium 4.1%
bear 3%
cat 2.8%
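The four services above disagree sharply (for example, Imagga's top guesses are animals while Amazon and Clarifai see a print of a person). One way to cut through that noise is to keep only labels that several services agree on. The sketch below illustrates the idea on a small hand-picked subset of the scores listed above; the `consensus` helper and the `min_providers` threshold are illustrative assumptions, not part of the record.

```python
from collections import defaultdict

# A hand-picked subset of the per-provider tag scores listed above
# (lowercased; scores are the confidence percentages shown in the record).
tags = {
    "amazon": {"art": 98.2, "person": 91.1, "painting": 79.2},
    "clarifai": {"paper": 97.5, "painting": 91.1, "art": 88.9},
    "imagga": {"baby": 35.1, "art": 8.8, "vintage": 7.4},
    "microsoft": {"gallery": 74.5, "art": 28.0, "toy": 5.0},
}

def consensus(tag_sets, min_providers=3):
    """Return labels reported by at least `min_providers` services,
    mapped to the highest confidence any single service gave them."""
    counts = defaultdict(int)
    best = defaultdict(float)
    for scores in tag_sets.values():
        for label, score in scores.items():
            counts[label] += 1
            best[label] = max(best[label], score)
    return {label: best[label] for label, n in counts.items() if n >= min_providers}

print(consensus(tags))  # {'art': 98.2}
```

With this subset, only "art" clears the three-provider threshold, which matches the human classification of the object as a print.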

Face analysis


AWS Rekognition

Age 60-90
Gender Male, 99.7%
Happy 1.5%
Sad 43.4%
Disgusted 0.9%
Angry 4%
Calm 43.1%
Surprised 1.2%
Confused 5.8%

Microsoft Cognitive Services

Age 61
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.1%
Painting 79.2%

Captions

Microsoft

a screen shot of a social media post 53.8%