Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943–1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Susan Wilmarth, 2014.106

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Tripod 99.7
Person 99.2
Human 99.2
Photographer 75.2
Photography 72.5
Photo 72.5

Clarifai
created on 2018-02-09

people 99.2
one 99.1
adult 97.9
woman 94.5
man 92.8
portrait 92
two 91.6
wear 86
administration 81.1
child 79.9
chair 75.1
music 74.9
actress 73.8
retro 70.7
home 69.6
actor 69.1
three 67.7
musician 67.5
recreation 67.5
pants 65.8

Imagga
created on 2018-02-09

tripod 46.5
engineer 37.8
rack 35.6
man 30.2
support 28
cleaner 23.8
male 22
person 19.4
people 16.7
adult 16.2
vacuum 15.2
outdoors 14.9
equipment 14.5
one 14.2
working 14.1
men 13.7
work 13.3
worker 13.3
sport 13.2
outdoor 13
tool 12.5
walking 12.3
attractive 11.9
happy 11.3
grass 11.1
professional 11
business 10.9
pretty 10.5
black 10.2
lifestyle 10.1
bag 9.9
job 9.7
businessman 9.7
golf 9.5
occupation 9.2
leisure 9.1
portrait 9.1
building 8.7
lawn 8.5
club 8.5
senior 8.4
swab 8.2
exercise 8.2
smiling 8
ball 7.9
smile 7.8
golfer 7.8
crutch 7.8
device 7.7
summer 7.7
fashion 7.5
clean 7.5
technology 7.4
alone 7.3
success 7.2
suit 7.2
active 7.2
posing 7.1

Google
created on 2018-02-09

white 96.4
photograph 96
black 95.7
standing 95.3
black and white 89.9
photography 84.6
snapshot 81.8
monochrome photography 74.3
girl 72.7
sitting 72.3
design 65.4
monochrome 65.1
floor 61.6
furniture 54.8
house 54.6
joint 51.7
flooring 51
fun 51
pattern 50.9
vintage clothing 50.7

Microsoft
created on 2018-02-09

person 90.5
tripod 18.1

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 74.9%
Angry 9.5%
Disgusted 2.5%
Sad 28.5%
Happy 3%
Calm 48%
Surprised 2.6%
Confused 5.9%

Microsoft Cognitive Services

Age 28
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%