Human Generated Data

Title

Untitled (VE Day: crowd of people on city street reading news on window)

Date

1945

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15353

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Person 98
Human 98
Person 97.7
Person 96.9
Person 95.2
Person 95.1
Person 94.7
Person 88
Person 81.4
Crowd 76.8
Face 76.8
Hat 76.7
People 67.6
Sun Hat 67.4
Meal 62.8
Food 62.8
Photography 62
Photo 62
Portrait 61.8
Indoors 60
Audience 59.8
Cap 59.4
Classroom 56.8
Room 56.8
School 56.8
Person 48.3

Imagga
created on 2022-03-05

surgeon 22.6
man 20.1
people 19.5
passenger 18.4
shop 16.7
work 15.7
business 15.2
male 14.9
person 14.1
old 13.9
building 13.8
stall 13
men 12.9
industry 12.8
hospital 12.7
mercantile establishment 12.5
city 12.5
adult 12.4
architecture 11.7
room 11.1
industrial 10.9
job 10.6
urban 10.5
patient 10.4
occupation 10.1
worker 9.8
daily 9.8
working 9.7
factory 9.6
doctor 9.4
medical 8.8
illness 8.6
construction 8.5
nurse 8.5
barbershop 8.4
equipment 8.4
place of business 8.2
history 8
metal 8
looking 8
uniform 7.9
medicine 7.9
wheeled vehicle 7.8
modern 7.7
power 7.5
fashion 7.5
house 7.5
human 7.5
home 7.2
transportation 7.2
clothing 7.2
wagon 7.1
vehicle 7.1
steel 7.1
shoe shop 7
travel 7
sky 7

Google
created on 2022-03-05

Photograph 94.2
Black 89.6
Motor vehicle 85.5
Hat 82.6
Line 81.9
Font 75.5
Monochrome 74
Monochrome photography 72.8
Event 72.6
Team 69.1
Sun hat 67.2
Stock photography 66.4
Crew 66
Cap 65.2
History 64.1
Room 62.8
Crowd 61.4
Art 58.2
Vintage clothing 55.9
Illustration 55.8

Microsoft
created on 2022-03-05

text 99.2
clothing 93.6
person 89.9
black and white 88.9
man 86.8
people 67.9
house 56.8
shop 14.8

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 86.5%
Calm 93.5%
Confused 2.5%
Sad 1.1%
Happy 0.8%
Disgusted 0.6%
Surprised 0.6%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 34-42
Gender Male, 55.8%
Calm 98.4%
Surprised 1%
Confused 0.2%
Sad 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Happy 46.6%
Calm 30.8%
Disgusted 6.1%
Fear 5.3%
Surprised 5.3%
Sad 2.1%
Confused 1.9%
Angry 1.8%

AWS Rekognition

Age 36-44
Gender Female, 97.5%
Calm 100%
Sad 0%
Angry 0%
Disgusted 0%
Surprised 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 14-22
Gender Male, 99.2%
Calm 67%
Happy 13.9%
Fear 10%
Surprised 4.7%
Confused 1.8%
Sad 1.4%
Disgusted 0.7%
Angry 0.4%

AWS Rekognition

Age 6-16
Gender Female, 64.3%
Calm 60.1%
Sad 29.6%
Fear 2.9%
Confused 2.1%
Surprised 1.7%
Happy 1.6%
Disgusted 1%
Angry 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Hat 76.7%

Captions

Microsoft

a group of people standing in front of a store window 80.7%
a group of people in front of a store window 80.5%
a group of people standing in front of a store 80.4%

Text analysis

Amazon

GERMAN
ENDS
WAR ENDS
WAR
arro
ЫГИ
DUSCO ЫГИ
DUSCO
NEL
OX-CIS
ONIC
KEIKI

Google

GERMAN
GERMAN WAR ENDS MJI7 YT3RA
MJI7
YT3RA
WAR
ENDS