Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Susan Wilmarth, 2014.103

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Tripod 99.9
Person 99.6
Human 99.6
Footwear 85.2
Apparel 85.2
Shoe 85.2
Clothing 85.2
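
The Amazon tags above are label-detection output of the kind Amazon Rekognition returns. A minimal sketch of such a request with boto3; the local file name and the MinConfidence threshold are assumptions, not values taken from this record:

import boto3

# Sketch only: detect labels for a local copy of the image with Amazon Rekognition.
rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=80,  # assumed threshold; return only labels scored at 80% or higher
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')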

Clarifai
created on 2018-02-09

people 99.4
one 98.8
adult 97.9
woman 94.8
man 92.6
two 92.6
portrait 91.8
recreation 89
wear 83.9
child 82.4
administration 80.7
indoors 78.7
room 78.1
facial expression 77.8
street 77.4
walking stick 76.5
chair 76.2
actress 74.9
three 72.3
monochrome 72.1
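
The Clarifai concepts above carry per-concept confidence scores from Clarifai's general prediction model. A rough sketch of one way such predictions can be requested over Clarifai's v2 REST API with the requests library; the API key, model id, and image URL are placeholders:

import requests

# Sketch only: Clarifai v2 predict request; key, model id, and image URL are placeholders.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed id of the public general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')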

Imagga
created on 2018-02-09

tripod 100
rack 100
support 100
man 23.5
sport 22.2
male 19.8
person 18.3
people 17.8
outdoor 16.8
outdoors 15.7
golf 15.3
walking 15.1
adult 14.9
men 13.7
golfer 13.7
ball 13.1
engineer 13
grass 12.6
club 12.2
playing 11.8
active 11.7
leisure 11.6
sky 11.5
black 11.4
player 11.3
play 11.2
one 11.2
attractive 11.2
happy 10.6
summer 10.3
outside 10.3
work 10.2
camera 10.2
recreation 9.9
travel 9.9
vacation 9.8
game 9.8
pretty 9.8
golfing 9.8
working 9.7
portrait 9.7
course 9.5
hobby 9.5
equipment 9.4
professional 9.3
silhouette 9.1
water 8.7
lifestyle 8.7
photograph 8.6
old 8.4
studio 8.4
hat 8.2
suit 8.1
photographer 8.1
sexy 8
mountain 8
putter 7.9
putting 7.8
shoot 7.7
construction 7.7
industry 7.7
hole 7.7
walk 7.6
field 7.5
senior 7.5
landscape 7.4
sports 7.4
exercise 7.3
sunset 7.2
shadow 7.2
activity 7.2
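
The Imagga tags can likewise be requested from Imagga's tagging endpoint, which uses HTTP basic authentication; the key, secret, and image URL below are placeholders:

import requests

# Sketch only: Imagga /v2/tags request; credentials and image URL are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')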

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

ground 96.9
tripod 21.2

Color Analysis

Face analysis

AWS Rekognition

Age 27-44
Gender Male, 99.1%
Surprised 7.7%
Disgusted 7.7%
Calm 20.9%
Confused 11.5%
Angry 13.3%
Happy 6%
Sad 32.9%
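
The age range, gender, and emotion percentages above are face attributes of the kind AWS Rekognition reports. A minimal boto3 sketch, again with a placeholder file name:

import boto3

# Sketch only: request all face attributes (age range, gender, emotions) for one image.
rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')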

Microsoft Cognitive Services

Age 46
Gender Male
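
The single age and gender estimate above is the kind of output the Microsoft Face API's detect endpoint returns when asked for those attributes. A rough sketch over the REST API; the endpoint region, subscription key, and image URL are placeholders:

import requests

# Sketch only: Face API detect call; endpoint, key, and image URL are placeholders.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com/face/v1.0/detect"
response = requests.post(
    ENDPOINT,
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_FACE_API_KEY"},
    json={"url": "https://example.com/image.jpg"},
)
for face in response.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')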

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
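
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client library, assuming credentials are configured and using a placeholder file name:

from google.cloud import vision

# Sketch only: face detection with the Cloud Vision client; file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum such as VERY_UNLIKELY, POSSIBLE, or VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)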

Feature analysis

Amazon

Person 99.6%
Shoe 85.2%

Categories