Human Generated Data

Title

Untitled (people waiting to give blood)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7636

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 99.3
Person 98.8
Person 98.5
Person 98.4
Person 96.9
Person 96.7
Person 95.5
Person 91.4
Chair 90.8
Furniture 90.8
Indoors 90.3
Interior Design 90.3
Room 90
Restaurant 88.6
Clothing 80.3
Shorts 80.3
Apparel 80.3
Cafeteria 75.3
Person 72.2
Person 70.5
People 67.7
Food 64.3
Meal 64.3
Cafe 62.7
Advertisement 59.2
Poster 59
Classroom 59
School 59
Photography 57.1
Photo 57.1
Living Room 55.6
Collage 55.2

Imagga
created on 2022-01-08

barbershop 97
shop 86.6
mercantile establishment 65.2
place of business 43.4
people 27.3
man 23.5
person 22.1
establishment 21.6
adult 19.1
men 18.9
male 17.7
musical instrument 16.2
chair 15.8
interior 15
business 14.6
black 14.4
accordion 13.2
salon 12.9
human 12.7
indoors 12.3
working 11.5
room 11
portrait 11
city 10.8
keyboard instrument 10.5
urban 10.5
home 10.4
indoor 10
sport 10
worker 9.8
businessman 9.7
women 9.5
sitting 9.4
motion 9.4
lifestyle 9.4
wind instrument 9.2
life 9.1
fashion 9
one 9
computer 8.8
youth 8.5
art 8.5
modern 8.4
style 8.2
group 8.1
office 8
equipment 8
work 7.8
boy 7.8
model 7.8
player 7.7
grunge 7.7
casual 7.6
leisure 7.5
holding 7.4
speed 7.3
window 7.3
design 7.3
music 7.2
body 7.2
transportation 7.2
handsome 7.1
to 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.5
clothing 80.8
furniture 79.2
person 76.8
table 75.7
chair 70.9
black and white 60.3
woman 54.1
house 50.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 97.5%
Calm 98%
Sad 0.7%
Confused 0.7%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Female, 99.9%
Calm 65.7%
Happy 32.8%
Surprised 0.4%
Sad 0.4%
Disgusted 0.3%
Angry 0.3%
Confused 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 90.8%

Captions

Microsoft

a group of people in a room 80.7%
a group of people standing in a room 76.2%
a group of people around each other 56.6%

Text analysis

Amazon

STOKET
20944A
DEFENSE
as
20944A.
HO
YACOX
NAMTE
٧٤٢١٨ج

Google

20944
HO
209
44A•
A.
NEFENSI
20944 A. NEFENSI HO 209 44A•