Human Generated Data

Title

Untitled (department store interior)

Date

c. 1955

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.640

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 94
Human 94
Apparel 92.3
Clothing 92.3
Shop 89.2
Chair 85.3
Furniture 85.3
Boutique 83.9
Person 83.8
Mannequin 78.4
Flooring 77.5
Art Gallery 69.7
Art 69.7
Floor 67.3

Imagga
created on 2022-01-09

chair 100
seat 81.2
furniture 60.2
interior 43.3
wheelchair 43.3
room 37
table 30.6
modern 27.3
rocking chair 25.9
house 22.6
floor 22.3
inside 21.2
architecture 21.1
indoors 21.1
window 21.1
home 19.1
chairs 18.6
wood 18.3
furnishing 18.1
light 18
empty 18
design 18
decor 17.7
shopping cart 17.6
barber chair 16.2
comfortable 15.3
urban 14.9
indoor 14.6
office 13.7
device 13.4
handcart 13.4
business 13.4
support 13.2
style 12.6
apartment 12.5
glass 12.4
nobody 12.4
space 12.4
shop 12.2
work 11.8
city 11.6
decoration 11.6
wheeled vehicle 11.4
building 11.3
people 11.2
lamp 11
equipment 10.8
dining 10.5
health 10.4
contemporary 10.3
luxury 10.3
hospital 10.1
restaurant 10
tables 9.8
metal 9.7
barbershop 9.5
living 9.5
wall 9.4
life 9.4
rest 9.3
elegance 9.2
salon 8.9
structure 8.8
corridor 8.8
residential 8.6
dinner 8.4
relaxation 8.4
armchair 8.3
care 8.2
door 8.2
transportation 8.1
working 8
scene 7.8
container 7.8
sitting 7.7
stool 7.6
relax 7.6
lifestyles 7.6
wheel 7.5
plant 7.5
bar 7.4
lifestyle 7.2
hall 7.1
medical 7.1
travel 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

furniture 98.9
table 96.5
chair 95.7
text 95
black and white 69.7
room 50.3

Face analysis
AWS Rekognition

Age 25-35
Gender Female, 100%
Calm 59.3%
Happy 26.8%
Fear 6.6%
Surprised 3%
Angry 1.5%
Sad 1.4%
Confused 0.8%
Disgusted 0.7%

AWS Rekognition

Age 13-21
Gender Female, 79.2%
Calm 95.6%
Angry 1.1%
Sad 1%
Surprised 0.7%
Fear 0.7%
Confused 0.5%
Disgusted 0.2%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94%
Chair 85.3%

Captions

Microsoft

a group of people standing in front of a store 55.9%
a group of people in front of a store 55.4%
a group of people in a room 55.3%

Text analysis

Amazon

Studio
Gene's Studio
Gene's

Google

Stadie
GonE
GonE Stadie