Human Generated Data

Title

Untitled (older man with military hat cooking outside with two women, one holding a three-tier cake)

Date

September 7, 1952

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18002

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Clothing 99.6
Apparel 99.6
Person 99
Person 98.5
Person 98.3
Person 97.5
Person 96.4
Person 84.1
Overcoat 83.4
Coat 83.4
Suit 77.6
Face 64.3
Portrait 61.9
Photography 61.9
Photo 61.9
Shelter 58.7
Nature 58.7
Outdoors 58.7
Countryside 58.7
Building 58.7
Rural 58.7
Female 56.2
Tire 55.2

Clarifai
created on 2023-10-29

people 100
adult 98.2
group together 98.1
group 97.8
woman 97.2
man 95.6
administration 93.9
many 93.9
child 90.8
several 90.6
vehicle 90.3
leader 88.3
wear 88.1
monochrome 87.9
two 85.8
music 85.2
street 82.4
three 82.1
chair 80.6
five 78.8

Imagga
created on 2022-03-04

man 20.2
industry 19.6
industrial 19.1
old 17.4
musical instrument 17.3
factory 16.9
steel 15.2
building 14.9
uniform 14.3
work 13.5
power 13.4
people 13.4
clothing 13.1
machine 13
device 12.6
architecture 12.5
person 12.1
metal 12.1
military uniform 11.3
percussion instrument 10.9
transportation 10.8
shop 10.7
male 10.6
inside 10.1
street 10.1
danger 10
barbershop 10
seller 9.8
engine 9.6
men 9.4
light 9.4
iron 9.3
equipment 9
nuclear 8.7
military 8.7
war 8.7
fire 8.4
energy 8.4
city 8.3
environment 8.2
protection 8.2
room 8.2
religion 8.1
vehicle 8.1
working 7.9
business 7.9
adult 7.8
destruction 7.8
black 7.8
steam 7.8
pollution 7.7
wheel 7.7
stone 7.7
fashion 7.5
gun 7.5
vintage 7.4
smoke 7.4
safety 7.4
dirty 7.2
stall 7.2
accordion 7.2
machinery 7.1
lamp 7.1
mercantile establishment 7.1
plant 7

Google
created on 2022-03-04

Hat 85.5
Black-and-white 84.8
Style 83.8
Window 81.8
Monochrome photography 73.9
Vintage clothing 73.3
Monochrome 73.2
Motor vehicle 69
History 65.3
Sun hat 64.6
Street 63.8
Musician 63.6
Stock photography 62.8
City 61.3
Classic 60.8
Art 60.2
Market 57.7
Hawker 57.4
Room 54.5
Metal 51.6

Microsoft
created on 2022-03-04

person 95.8
clothing 95.3
black and white 91.3
text 75.8
man 66.6

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 95.4%
Calm 90.6%
Surprised 4.5%
Disgusted 1.3%
Angry 1.2%
Sad 1.1%
Confused 0.8%
Happy 0.4%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 96.7%
Calm 87%
Sad 6.5%
Happy 1.4%
Confused 1.4%
Disgusted 1.3%
Angry 0.8%
Fear 0.8%
Surprised 0.7%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Sad 79.9%
Angry 12.9%
Fear 2.6%
Calm 1.6%
Happy 0.8%
Surprised 0.8%
Confused 0.7%
Disgusted 0.7%

AWS Rekognition

Age 30-40
Gender Male, 72.3%
Calm 49.3%
Sad 28.2%
Fear 7.7%
Confused 5%
Happy 4.2%
Surprised 2.5%
Angry 1.7%
Disgusted 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.6%
Person 99%
Person 98.5%
Person 98.3%
Person 97.5%
Person 96.4%
Person 84.1%

Text analysis

Amazon

NOGOX
YT37A°2 NOGOX
YT37A°2
٢ад