Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Susan Wilmarth, 2014.102

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Tripod 99.9
Person 99.5
Human 99.5
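
Tags like the ones above follow the shape of AWS Rekognition's label-detection output: a label name paired with a confidence score from 0 to 100. Below is a minimal sketch of how such tags are typically produced with boto3; the S3 bucket and object names are hypothetical placeholders.

import boto3

# Rekognition client; credentials come from the standard AWS configuration.
rekognition = boto3.client("rekognition")

# Detect labels in an image stored in S3 (bucket and key are placeholders).
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-images", "Name": "wilmarth-untitled.jpg"}},
    MinConfidence=80,
)

# Print "Label confidence" pairs, matching the format of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")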

Clarifai
created on 2018-02-09

people 99.5
one 98.7
adult 97.9
woman 94.1
two 92.8
portrait 90.8
man 89.1
indoors 88.5
wear 86.6
actress 85.6
room 84.3
child 83.8
tripod 81.8
step 81.6
actor 81.4
facial expression 81.1
ladder 79.7
movie 79.7
recreation 77.4
chair 76.5
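
Clarifai's general model returns concepts with values between 0 and 1, so a score like "people 99.5" corresponds to a value of 0.995 scaled by 100. The sketch below targets Clarifai's v2 REST predict endpoint; the API key, model ID, and image URL are assumptions, not values from this record.

import requests

API_KEY = "CLARIFAI_API_KEY"  # placeholder, not a real key
MODEL_ID = "general-image-recognition"  # assumed ID for the public general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {
    "inputs": [{"data": {"image": {"url": "https://example.com/untitled.jpg"}}}]
}
resp = requests.post(url, json=payload,
                     headers={"Authorization": f"Key {API_KEY}"})

# Each concept carries a name and a 0-1 value; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")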

Imagga
created on 2018-02-09

tripod 100
rack 100
support 100
man 25.5
male 22.7
person 20.7
people 18.4
engineer 15.2
outdoors 14.9
men 14.6
adult 13.6
camera 12.9
equipment 12.7
working 12.4
sport 12.3
golf 11.5
happy 11.3
professional 11
playing 10.9
golfer 10.7
outdoor 10.7
one 10.4
walking 10.4
outside 10.3
construction 10.3
black 10.2
work 10.2
silhouette 9.9
studio 9.9
shoot 9.7
photograph 9.6
player 9.4
club 9.4
photographer 9.4
leisure 9.1
attractive 9.1
sky 8.9
worker 8.9
ball 8.7
standing 8.7
water 8.7
industry 8.5
hobby 8.5
business 8.5
bag 8.5
vacation 8.2
technology 8.2
active 8.1
activity 8.1
building 7.9
grass 7.9
portrait 7.8
play 7.7
travel 7.7
lens 7.7
pretty 7.7
landscape 7.4
safety 7.4
occupation 7.3
alone 7.3
hat 7.3
industrial 7.3
lifestyle 7.2
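
Imagga's tagging endpoint returns English tag text with a confidence score, which is the format of the list above. A sketch against the v2 /tags endpoint follows; Imagga uses HTTP basic auth with an API key/secret pair, and both credentials and the image URL here are placeholders.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/untitled.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)

# Each entry holds a confidence score and the tag text keyed by language.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")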

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

posing 36.6
tripod 23.8

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 96.9%
Happy 4.3%
Confused 7.3%
Sad 32.6%
Angry 8.2%
Calm 40.7%
Disgusted 4.6%
Surprised 2.3%
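
The age range, gender, and per-emotion confidences above map directly onto Rekognition's face-detection output when all facial attributes are requested. A minimal boto3 sketch, again with hypothetical bucket and key names:

import boto3

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds age range, gender, and emotions to the response.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-images", "Name": "wilmarth-untitled.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # One confidence score per emotion, as in the list above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")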

Microsoft Cognitive Services

Age 24
Gender Male
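
The single age and gender estimate above is characteristic of the Microsoft Cognitive Services Face API, a subscription-keyed REST endpoint (since retired for new customers). A hedged sketch of the classic v1.0 detect call; the region, key, and image URL are all placeholders.

import requests

endpoint = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"  # placeholder region
headers = {"Ocp-Apim-Subscription-Key": "FACE_API_KEY"}  # placeholder key
params = {"returnFaceAttributes": "age,gender"}
body = {"url": "https://example.com/untitled.jpg"}  # placeholder URL

resp = requests.post(endpoint, headers=headers, params=params, json=body)

# Each detected face carries the requested attributes.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")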

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
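
Unlike the other services, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above read "Very unlikely". A sketch with the google-cloud-vision client library, assuming a placeholder image URL:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.com/untitled.jpg"  # placeholder URL

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)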

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

interior objects 99.9%