Human Generated Data

Title

Untitled (older man with military hat cooking outside with two women, one holding a three-tier cake)

Date

September 7, 1952

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18003

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Person 99.2
Person 98.9
Person 98.3
Person 97.1
Person 91.6
Musician 88.1
Musical Instrument 88.1
Clothing 87.1
Apparel 87.1
Person 83
Drummer 64.2
Percussion 64.2
Pot 58.9
Coat 56.8
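
These label/confidence pairs have the shape returned by Amazon Rekognition's label-detection endpoint. A minimal sketch of the kind of call that produces them, assuming boto3 credentials and a hypothetical local filename for the photograph (the 2019-11-16 model version behind the scores above is not selectable, so confidences will differ):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("4.2002.18003.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print "Label Confidence" pairs like the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```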

Clarifai
created on 2019-11-16

people 100
adult 98.7
group together 98.3
group 97.6
man 97
woman 95.2
street 94.7
vehicle 91.5
wear 89.7
merchant 88.3
one 87.4
two 85.9
many 82.9
transportation system 82.9
administration 81.5
boy 81.5
military 80.8
several 79.2
child 78.9
actor 77.3
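
A sketch of the equivalent Clarifai request against its public general concept model, assuming a hypothetical API key and image URL; the model id below is Clarifai's public general model as addressed by the v2 REST API of that era, and a current deployment may rank concepts differently:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # hypothetical credential
GENERAL_MODEL = "aaa03c23b3724a16a56b629203edc62c"  # Clarifai public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {
        "url": "https://example.org/4.2002.18003.jpg"  # hypothetical URL
    }}}]},
)

# Concept values are 0-1; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```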

Imagga
created on 2019-11-16

steel drum 79.9
percussion instrument 77
musical instrument 63.3
industrial 18.1
people 17.8
industry 17.1
interior 16.8
business 15.8
factory 15.5
steel 15.4
metal 15.3
man 14.8
work 14.3
kitchen 13.4
chair 12.4
machine 11.8
city 11.6
transportation 10.7
person 10.7
drum 10.7
working 10.6
old 10.4
home 10.4
black 10.2
architecture 10.1
indoors 9.7
building 9.6
travel 9.1
modern 9.1
adult 9.1
technology 8.9
room 8.6
sitting 8.6
power 8.4
furniture 8.3
fun 8.2
cook 8.2
environment 8.2
light 8
worker 8
lifestyle 7.9
cooking 7.9
men 7.7
line 7.7
energy 7.6
structure 7.5
house 7.5
street 7.4
inside 7.4
computer 7.3
equipment 7.3
transport 7.3
home appliance 7.3
food 7.2
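
Imagga exposes the same kind of tagging through its v2 REST endpoint. A minimal sketch, assuming hypothetical API credentials and a hypothetical image URL:

```python
import requests

# Hypothetical credentials; Imagga uses HTTP basic auth (key, secret).
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.18003.jpg"},  # hypothetical
    auth=auth,
)

# Each entry pairs an English tag with a 0-100 confidence, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```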

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 94.3
person 92.8
clothing 87.8
black and white 83.3
waste container 61.9
cooking 54.3
preparing 44.8
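
The Microsoft tags correspond to Azure Computer Vision's analyze operation with the Tags visual feature. A sketch, assuming a hypothetical resource endpoint and key (shown against the current v3.2 API; the 2019 service version that produced the scores above may differ):

```python
import requests

# Hypothetical Azure resource endpoint and key.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    params={"visualFeatures": "Tags"},
    json={"url": "https://example.org/4.2002.18003.jpg"},  # hypothetical URL
)

# Confidences are 0-1; scale to match the percentages above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```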

Face analysis

AWS Rekognition

Age 51-69
Gender Male, 51.2%
Happy 45.2%
Fear 45%
Sad 45%
Confused 45.1%
Disgusted 45.1%
Angry 45.1%
Calm 54.5%
Surprised 45.1%

AWS Rekognition

Age 55-73
Gender Male, 54.1%
Calm 48.3%
Confused 45.6%
Happy 45.1%
Disgusted 45.1%
Surprised 45%
Sad 50.6%
Fear 45.1%
Angry 45.2%

AWS Rekognition

Age 51-69
Gender Male, 54%
Angry 45.6%
Sad 53.8%
Calm 45.3%
Disgusted 45%
Confused 45.1%
Surprised 45%
Fear 45.1%
Happy 45%

AWS Rekognition

Age 31-47
Gender Female, 50.3%
Angry 50.2%
Happy 49.6%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Fear 49.6%
Sad 49.5%
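
The four Rekognition face records above (age range, gender, per-emotion confidences) have the shape returned by detect_faces when all facial attributes are requested. A minimal sketch, assuming boto3 credentials and a hypothetical local filename:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("4.2002.18003.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```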

Microsoft Cognitive Services

Age 63
Gender Male

Microsoft Cognitive Services

Age 67
Gender Male

Microsoft Cognitive Services

Age 59
Gender Female
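
The Microsoft Cognitive Services estimates came from the Face API's detect operation, which in 2019 could return age and gender attributes (Microsoft has since retired those attributes, so this call no longer works as shown). A sketch with a hypothetical endpoint and key:

```python
import requests

# Hypothetical Azure Face resource; age/gender attributes are retired today.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    params={"returnFaceAttributes": "age,gender"},
    json={"url": "https://example.org/4.2002.18003.jpg"},  # hypothetical URL
)

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```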

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
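
The Google Vision records report the face-detection likelihood enums (VERY_UNLIKELY through VERY_LIKELY) for each detected face. A minimal sketch using the google-cloud-vision client library, assuming Google Cloud credentials and a hypothetical local filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("4.2002.18003.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```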

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 88.9%
people portraits 5.9%
food drinks 3.6%
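
The category percentages come from Imagga's categorization endpoint rather than its tagger; the labels above ("paintings art", "people portraits", "food drinks") look like output from its personal_photos categorizer, which is an assumption here. A sketch, with hypothetical credentials and image URL:

```python
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # hypothetical
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/4.2002.18003.jpg"},  # hypothetical
    auth=auth,
)

# Category confidences are percentages over the categorizer's label set.
for cat in resp.json()["result"]["categories"]:
    print(f"{cat['name']['en']} {cat['confidence']:.1f}%")
```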