Human Generated Data

Title

Untitled (older man with military hat cooking outside with two women, one holding a three-tier cake)

Date

September 7, 1952

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18004

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Person 98.8
Person 98.7
Person 98.2
Person 96.4
Musical Instrument 94.4
Musician 94.4
Person 91
Percussion 75.1
Drummer 75.1
Apparel 67.9
Clothing 67.9

Clarifai
created on 2019-11-16

people 100
group 99.1
adult 98.4
group together 97.7
many 95.7
administration 94.2
man 94.2
wear 93.3
vehicle 91
several 89
woman 88.6
outfit 88.3
leader 87.5
street 86.4
merchant 85.9
two 83.1
furniture 82.3
military 81.8
one 81.4
veil 80.1

Imagga
created on 2019-11-16

man 34.9
barbershop 33.6
shop 30.1
people 27.9
office 25
male 24.1
work 23.3
person 22.7
mercantile establishment 21.7
working 20.3
laptop 20.3
computer 20.2
business 20
adult 17.7
job 17.7
men 17.2
happy 16.9
home 16.7
professional 15.7
indoors 14.9
chair 14.9
smiling 14.5
place of business 14.4
doctor 14.1
hospital 14.1
worker 13.5
businessman 13.2
medical 13.2
room 12.9
smile 12.1
technology 11.9
medicine 11.4
stall 11.4
sitting 11.2
corporate 11.2
health 11.1
patient 11.1
portrait 11
occupation 11
uniform 10.8
hairdresser 10.7
clinic 10.3
executive 10.2
house 10
family 9.8
success 9.7
barber chair 9.6
30s 9.6
senior 9.4
table 8.9
seller 8.8
desk 8.8
seat 8.6
meeting 8.5
communication 8.4
mature 8.4
old 8.4
equipment 8.3
lifestyle 7.9
casual clothing 7.8
mother 7.8
clothing 7.8
illness 7.6
workplace 7.6
two 7.6
nurse 7.6
talking 7.6
manager 7.4
successful 7.3
businesswoman 7.3
newspaper 7.3
establishment 7.2
team 7.2
surgeon 7.1
interior 7.1
happiness 7
together 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.4
clothing 94.7
person 94.2
indoor 85.2
old 66.5
man 61.4
black and white 50.3
clothes 18.9
several 11.2
cluttered 10.3

Face analysis

AWS Rekognition

Age 51-69
Gender Male, 52.7%
Angry 45%
Surprised 45%
Sad 45%
Happy 55%
Disgusted 45%
Confused 45%
Calm 45%
Fear 45%

AWS Rekognition

Age 43-61
Gender Female, 54.1%
Calm 48.6%
Fear 45%
Confused 45.5%
Happy 50.2%
Angry 45.1%
Disgusted 45.2%
Sad 45.3%
Surprised 45.1%

AWS Rekognition

Age 50-68
Gender Male, 53.5%
Happy 45.9%
Fear 45%
Angry 45.1%
Confused 45.3%
Surprised 45.1%
Disgusted 45.1%
Calm 53.2%
Sad 45.3%

AWS Rekognition

Age 24-38
Gender Female, 50.2%
Fear 49.5%
Angry 49.5%
Calm 49.5%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 50.4%
Disgusted 49.5%

Microsoft Cognitive Services

Age 58
Gender Female

Microsoft Cognitive Services

Age 69
Gender Male

Microsoft Cognitive Services

Age 62
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people standing next to a window 87.8%
a group of people standing in a room 87.7%
a group of people standing in front of a store 87.1%