Human Generated Data

Title

Untitled (older man with military hat cooking and two older women, one holding cake, in backyard)

Date

September 7, 1952

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18000

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 98.8
Human 98.8
Person 98.7
Person 97.4
Person 95
Clothing 95
Apparel 95
Person 90.1
Musician 86.2
Musical Instrument 86.2
Person 81.9
Female 78.3
Leisure Activities 73.4
Performer 70.7
Chair 67.8
Furniture 67.8
People 67
Woman 64.1
Dress 64.1
Portrait 62.7
Face 62.7
Photography 62.7
Photo 62.7
Meal 61.2
Food 61.2
Brick 56.5
Guitarist 55.5
Guitar 55.5
Suit 55.3
Coat 55.3
Overcoat 55.3
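The Amazon tags above follow the shape of an Amazon Rekognition `detect_labels` response: each label has a name and a confidence score. A minimal sketch of how such a response could be rendered into the "Name confidence" lines shown here (the sample payload and its values are illustrative, not taken from this image; the real API call, commented out, would require `boto3` and AWS credentials):

```python
# Sketch: format a Rekognition detect_labels-style response into
# "Name confidence" lines, as in the tag listing above.

def format_labels(response, min_confidence=55.0):
    """Render a detect_labels-style response as 'Name confidence' lines."""
    lines = []
    for label in response.get("Labels", []):
        conf = label["Confidence"]
        if conf >= min_confidence:  # drop low-confidence labels
            lines.append(f"{label['Name']} {conf:.1f}")
    return lines

# Sample payload shaped like Rekognition output (values illustrative):
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.8},
        {"Name": "Clothing", "Confidence": 95.0},
        {"Name": "Guitar", "Confidence": 55.5},
    ]
}

for line in format_labels(sample):
    print(line)

# Real call (requires boto3 and AWS credentials):
# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:
#     resp = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)
```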

Clarifai
created on 2023-10-29

people 99.9
group together 98
adult 97.1
group 97.1
woman 96.8
chair 95.6
man 95.4
child 94.5
furniture 94.3
monochrome 93.8
several 92.7
two 92.3
street 90.5
home 90.3
many 88.1
seat 88
four 85.9
three 85.8
family 85
wear 84.2

Imagga
created on 2022-03-04

seller 30.2
stall 28.9
building 19
factory 17.9
industry 17.1
city 16.6
old 16
machine 15.1
street 13.8
industrial 13.6
man 13.4
architecture 13.3
steel 13.3
shop 12.9
work 12.8
metal 12.1
barbershop 11.8
people 11.7
urban 11.4
men 11.2
power 10.9
transportation 10.8
light 10.7
equipment 10.4
house 10
male 9.9
working 9.7
mercantile establishment 9.5
inside 9.2
machinery 8.9
construction 8.6
chair 8.2
religion 8.1
person 8
manufacturing 7.8
labor 7.8
heavy 7.6
business 7.3
passenger 7.2
art 7.2
night 7.1
job 7.1
interior 7.1
device 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 95.2
black and white 93.2
text 92
clothing 91.1
person 90.7

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 52.8%
Calm 86.7%
Happy 6.1%
Disgusted 2.3%
Surprised 1.9%
Confused 1.5%
Sad 0.7%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 50-58
Gender Male, 99.9%
Calm 75.5%
Sad 18%
Happy 2.2%
Confused 1.5%
Surprised 0.8%
Disgusted 0.8%
Fear 0.7%
Angry 0.4%

AWS Rekognition

Age 41-49
Gender Male, 99%
Sad 99.4%
Happy 0.2%
Fear 0.1%
Calm 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
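Each AWS Rekognition face entry above lists an estimated age range, a gender guess, and emotion scores sorted from most to least confident, matching the structure of a `detect_faces` response. A small sketch of that sorting step, using a sample face record with illustrative values (not the actual response for this image):

```python
# Sketch: rank emotion scores from a Rekognition detect_faces-style
# face record, as in the per-face listings above.

def rank_emotions(face):
    """Return (Emotion, confidence) pairs sorted by descending confidence."""
    emotions = face.get("Emotions", [])
    ranked = sorted(emotions, key=lambda e: e["Confidence"], reverse=True)
    # Rekognition reports emotion types in upper case; title-case for display.
    return [(e["Type"].capitalize(), e["Confidence"]) for e in ranked]

# Sample face record shaped like detect_faces output (values illustrative):
sample_face = {
    "AgeRange": {"Low": 50, "High": 58},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [
        {"Type": "SAD", "Confidence": 18.0},
        {"Type": "CALM", "Confidence": 75.5},
        {"Type": "HAPPY", "Confidence": 2.2},
    ],
}

for emotion, conf in rank_emotions(sample_face):
    print(f"{emotion} {conf:.1f}%")
```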

Feature analysis

Amazon

Person
Person 98.8%
Person 98.7%
Person 97.4%
Person 95%
Person 90.1%
Person 81.9%

Text analysis

Amazon

٢ад

Google

YT37A°2-
YT37A°2-