Human Generated Data

Title

Untitled (commercial laundry operation)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1551

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99
Human 99
Person 95.9
Room 95.8
Indoors 95.8
Interior Design 94.8
Furniture 87.7
Bird 85.2
Animal 85.2
Bird 83.9
Bedroom 77.3
People 76.1
Crowd 75.3
Person 72.2
Person 66.5
Bird 66
Bed 61.2
Classroom 58.4
School 58.4
Dorm Room 58.1
Housing 57.2
Building 57.2
Audience 57
Waiting Room 56.4
Person 51.4
Person 42.1

Clarifai
created on 2023-10-15

people 99.4
many 99.3
group 97.5
adult 94.7
man 94.4
war 94.3
crowd 94.3
monochrome 89.4
group together 85.2
room 83.8
military 81
education 80.8
street 76.8
no person 76.4
woman 75.6
soldier 75.3
administration 73.5
child 72.9
furniture 71.7
skirmish 71.4

Imagga
created on 2021-12-14

steel drum 100
percussion instrument 100
musical instrument 84.7
building 21.2
city 17.4
travel 16.2
winter 15.3
architecture 14.8
house 14.2
old 13.2
snow 12.9
street 12.9
transportation 12.5
structure 12.5
people 12.3
balcony 11.9
urban 11.4
sky 10.8
light 10.7
clothing 10.6
modern 10.5
cold 10.3
decoration 10.3
shop 9.5
scene 9.5
stone 9.3
business 9.1
drum 8.9
window 8.8
brassiere 8.6
holiday 8.6
interior 8
women 7.9
glass 7.8
outdoor 7.6
horse 7.6
historical 7.5
wood 7.5
landscape 7.4
transport 7.3
tourist 7.2
metal 7.2
home 7.2
history 7.1
chair 7.1
wooden 7
indoors 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.4
drawing 86.1
sketch 65.4
black and white 56.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 51-69
Gender Male, 82.9%
Calm 96.3%
Sad 1.2%
Happy 1.1%
Surprised 0.6%
Angry 0.3%
Confused 0.3%
Fear 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Bird 85.2%

Categories

Text analysis

Amazon

DEL
DE
M 11

Google

M T3AZ O
M
T3AZ
O