Human Generated Data

Title

Untitled (office, woman sitting at typewriter)

Date

1953

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22303

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.5
Human 99.5
Chair 99
Furniture 99
Person 97.7
Restaurant 95.8
Person 95.2
Table 89.1
Indoors 86.9
Cafeteria 82.8
Building 78.4
Housing 78.4
Room 76.2
Living Room 76.2
Desk 75
Dining Table 74.6
Cafe 74
Sitting 73.6
Meal 67
Food 67
Apparel 65.4
Clothing 65.4
Flooring 60.6
Face 60.5
Office 57.2
Couch 57.1
Shelf 56
Screen 55.9
Electronics 55.9
Monitor 55.8
Display 55.8
LCD Screen 55.3

Imagga
created on 2022-03-11

chair 60.5
interior 56.6
furniture 51.7
room 51.5
table 42.4
barber chair 36.5
salon 36.1
modern 35.1
kitchen 32.5
seat 32.4
home 31.1
house 30.9
shop 29.1
barbershop 28.5
decor 27.4
indoors 27.2
floor 23.2
design 22.5
office 21.8
wood 20
mercantile establishment 19.5
window 19.3
restaurant 19.3
architecture 18.7
inside 17.5
desk 17.5
light 17.4
indoor 17.4
luxury 17.2
apartment 16.3
counter 15.6
dining 15.2
decoration 15.2
style 14.8
glass 14.8
work 14.1
empty 13.7
people 13.4
lamp 13.3
nobody 13.2
place of business 13.1
furnishing 12.9
residential 12.5
comfortable 12.4
contemporary 12.2
cabinet 12.2
business 12.1
clinic 12.1
wall 12
elegance 11.8
chairs 11.8
lifestyle 11.6
stove 11.5
living 11.4
food 11
domestic 10.9
3d 10.8
mirror 10.5
computer 10.4
service 10.2
vase 9.7
hotel 9.6
classroom 9.3
appliance 9.3
oven 8.8
tile 8.8
elegant 8.6
estate 8.5
man 8.1
building 8.1
metal 8
working 8
medical 7.9
hospital 7.9
wooden 7.9
tables 7.9
sink 7.8
structure 7.8
space 7.8
dinner 7.7
check 7.7
person 7.7
expensive 7.7
sofa 7.7
clean 7.5
establishment 7.5
plant 7.5
bar 7.4
male 7.1
steel 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

furniture 97.8
indoor 96.5
table 95.8
office building 83.9
text 83.6
chair 83.4
black and white 81.6
desk 81.2
sink 73.7
house 70.8
building 65.4
room 51.6
cluttered 14.9

Face analysis

Amazon

Google

AWS Rekognition

Age 52-60
Gender Female, 75.4%
Calm 100%
Happy 0%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 61.5%
Calm 99.7%
Sad 0.1%
Disgusted 0.1%
Surprised 0%
Happy 0%
Fear 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 22-30
Gender Female, 93.8%
Calm 91.5%
Sad 6.5%
Fear 0.5%
Confused 0.4%
Surprised 0.3%
Happy 0.3%
Disgusted 0.3%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 99%

Captions

Microsoft

a person sitting at a desk 91.8%
a person sitting on a desk 86.6%
a person sitting at a desk in a room 86.5%

Text analysis

Amazon

KODVK-2.NEEL