Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Christopher Wilmarth, American, 1943-1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Susan Wilmarth, 2014.94

Copyright

© Estate of Christopher Wilmarth

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Tripod 99.5
Human 99.4
Person 99.4
Person 98.6
Photo 83.8
Photography 83.8

Clarifai
created on 2018-02-09

people 99.7
monochrome 99.5
man 96.8
adult 96.7
one 95.5
two 95.1
chair 94.0
woman 89.7
furniture 87.3
room 86.9
indoors 86.8
street 85.1
black and white 84.3
portrait 84.3
art 84.1
step 82.7
actor 81.8
child 81.5
leader 78.6
music 78.2

Imagga
created on 2018-02-09

tripod 84.5
rack 64.8
support 49.9
man 28.2
male 22.7
silhouette 22.4
people 21.2
person 16.4
cleaner 15.5
crutch 15.1
business 13.4
barrier 13.2
men 12.9
building 12.8
adult 12.3
outdoor 12.2
black 12.0
staff 11.9
sport 11.6
posing 11.6
microphone 11.5
working 11.5
attractive 11.2
obstruction 10.9
sky 10.8
sunset 10.8
one 10.5
equipment 9.8
businessman 9.7
portrait 9.7
stick 9.7
sun 9.7
walking 9.5
work 9.4
light 9.4
fashion 9
structure 8.5
professional 8.4
studio 8.4
human 8.2
window 8.2
outdoors 8.2
device 8.1
worker 8.0
job 8.0
ladder 7.9
standing 7.8
modern 7.7
happy 7.5
city 7.5
holding 7.4
occupation 7.3
alone 7.3
office 7.2
sexy 7.2
life 7.2

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

wall 95.9
indoor 91.5
white 62.6

Face analysis

AWS Rekognition

Age 15-25
Gender Female, 97.9%
Confused 1.7%
Happy 83.1%
Surprised 1.3%
Angry 1.2%
Sad 2.5%
Calm 8.2%
Disgusted 2.1%

Microsoft Cognitive Services

Age 14
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

interior objects 99.8%
pets animals 0.1%