Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943-1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Susan Wilmarth, 2014.105

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2019-04-08

Tripod 99.4
Person 99
Human 99
Photographer 66.1
Photography 65.8
Photo 65.8
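
The Amazon labels above are the kind of output returned by AWS Rekognition label detection. The sketch below shows one way such tags could be reproduced with boto3; the S3 bucket, object key, and region are placeholders, not part of the museum record.

import boto3

# Hypothetical S3 location of the photograph; not part of the museum record.
BUCKET = "example-bucket"
KEY = "wilmarth-untitled.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels returns label names with confidence scores on a 0-100 scale,
# comparable to the "Tripod 99.4", "Person 99" values listed above.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MaxLabels=10,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')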

Clarifai
created on 2018-02-09

people 99
one 98.8
adult 98
woman 95.1
portrait 94.1
man 89.9
wear 87.7
movie 86.9
two 83
administration 81.9
actor 81.9
actress 77.6
chair 73.5
child 72.2
retro 70.7
facial expression 70.2
ladder 69.9
music 67.8
indoors 66.8
recreation 66.6
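
The Clarifai concepts above come from Clarifai's general prediction model; the raw API reports values in [0, 1], which correspond to the 0-100 figures shown here. The sketch below targets the Clarifai v2 REST API as it existed around the date of this analysis; the API key, image URL, and model identifier are placeholders and should be read as assumptions.

import requests

# Placeholders; not part of the museum record.
API_KEY = "YOUR_CLARIFAI_API_KEY"
IMAGE_URL = "https://example.org/wilmarth-untitled.jpg"
MODEL_ID = "general-image-recognition"  # assumed identifier for the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a confidence in [0, 1]; scaling by 100
# gives values comparable to the "people 99", "one 98.8" entries above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')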

Imagga
created on 2018-02-09

swab 34.4
man 26.2
cleaner 24.8
tripod 24.6
crutch 23.5
person 20.7
male 20.7
adult 20
sport 19.3
people 19
rack 18.8
staff 17.4
support 17.2
stick 16.7
outdoors 16.4
attractive 16.1
men 15.5
pretty 15.4
one 14.9
happy 14.4
work 13.3
walking 13.3
peg 12.6
lifestyle 12.3
outdoor 12.2
smile 12.1
device 11.7
worker 11.6
golf 11.5
urban 11.4
fashion 11.3
fun 11.2
playing 10.9
exercise 10.9
city 10.8
active 10.8
golfer 10.7
posing 10.7
standing 10.4
outside 10.3
prosthesis 10.1
professional 10.1
street 10.1
engineer 10
working 9.7
business 9.7
portrait 9.7
sexy 9.6
player 9.5
bag 9.3
cute 9.3
clean 9.2
leisure 9.1
ball 9.1
game 8.9
success 8.8
indoors 8.8
play 8.6
life 8.6
club 8.5
senior 8.4
health 8.3
alone 8.2
lady 8.1
black 8.1
building 8
job 8
smiling 8
women 7.9
happiness 7.8
corrective 7.6
human 7.5
silhouette 7.4
action 7.4
equipment 7.4
fit 7.4
tool 7.2
suit 7.2
body 7.2
grass 7.1
to 7.1
summer 7.1
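
The Imagga tags above match the shape of Imagga's v2 tagging endpoint, which authenticates with an API key and secret over HTTP Basic auth. The sketch below is an approximation; the key, secret, image URL, and exact response layout are assumptions.

import requests

# Placeholders; not part of the museum record.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/wilmarth-untitled.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score,
# matching the "swab 34.4", "man 26.2" style of values above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')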

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 93.9
floor 91.8
tripod 18.3
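
The Microsoft tags above correspond to the Tags feature of the Azure Computer Vision analyze endpoint. The sketch below calls the REST API directly; the endpoint host, subscription key, API version, and image URL are placeholders, and the version string is an assumption for this 2018-era analysis.

import requests

# Placeholders; not part of the museum record.
ENDPOINT = "https://westus.api.cognitive.microsoft.com"
SUBSCRIPTION_KEY = "YOUR_AZURE_VISION_KEY"
IMAGE_URL = "https://example.org/wilmarth-untitled.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Confidences come back in [0, 1]; scaled by 100 they line up with
# "person 93.9", "floor 91.8", "tripod 18.3" above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')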

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 92%
Surprised 7.5%
Angry 11.2%
Sad 9.7%
Calm 44.1%
Disgusted 5.9%
Confused 12.9%
Happy 8.6%
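
The AWS Rekognition face attributes above (age range, gender, and an emotion distribution) have the shape of DetectFaces output when all attributes are requested. A minimal boto3 sketch follows; the S3 location and region are placeholders, not part of the museum record.

import boto3

# Hypothetical S3 location of the photograph; not part of the museum record.
BUCKET = "example-bucket"
KEY = "wilmarth-untitled.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] adds age range, gender, and emotions to the response.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')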

Microsoft Cognitive Services

Age 20
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
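
The Google Vision values above are likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why they read "Very unlikely" instead of numbers. The sketch below assumes a recent version of the google-cloud-vision client with credentials already configured in the environment; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder image location; not part of the museum record.
image = vision.Image()
image.source.image_uri = "https://example.org/wilmarth-untitled.jpg"

response = client.face_detection(image=image)

# Each detected face reports likelihoods for the same attributes listed above.
for face in response.face_annotations:
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)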

Feature analysis

Amazon

Person 99%