Human Generated Data

Title

Untitled (storefront window, menswear, Schaffner & Marx)

Date

c. 1940s

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1503

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Shop 98.9
Person 97.5
Human 97.5
Person 97.1
Window Display 94
Person 93.9
Clothing 93.1
Apparel 93.1
Boutique 68.4
Person 62.6
Long Sleeve 59.2
Sleeve 59.2
Coat 58.4
Bakery 56.4
Person 47.9

Clarifai
created on 2023-10-15

people 99
monochrome 96.8
scientist 95.5
science 95
technology 94.8
adult 93.4
vehicle 92.8
man 92.8
furniture 91.8
laboratory 91.4
indoors 88
commerce 86.8
research 85.2
industry 85.2
one 83.6
watercraft 83.6
group 83.5
two 82.9
invention 82.8
option 81.3

Imagga
created on 2021-12-14

case 37.3
dishwasher 30.3
equipment 27.8
white goods 22.5
technology 21.5
sketch 18.5
drawing 17.5
design 17.4
home appliance 17
architecture 16.5
appliance 15.9
house 15.9
construction 15.4
building 13.5
plan 13.2
electronic equipment 13.1
home 12.8
hardware 12.5
device 12
glass 12
architect 11.6
science 11.6
business 11.5
diagram 11.5
computer 11.3
modern 11.2
industry 11.1
development 10.6
working 10.6
project 10.6
blueprint 9.8
digital 9.7
close 9.7
electronics 9.5
art 9.3
finance 9.3
data 9.1
old 9
paper 8.6
plastic 8.3
graphic 8
idea 8
interior 8
conceptual 7.9
3d 7.7
structure 7.7
apartment 7.7
money 7.7
engineering 7.6
hand 7.6
tube 7.5
network 7.4
retro 7.4
detail 7.2
history 7.1
work 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 94.9
indoor 87.4
person 76.9
clothing 76.2
white 71.9
black 65.2
old 57.1
store 31.1

Face analysis

AWS Rekognition

Age 13-23
Gender Female, 80.4%
Calm 76.9%
Happy 9.8%
Sad 3.8%
Angry 2.2%
Surprised 2.1%
Confused 2%
Fear 1.9%
Disgusted 1.2%

AWS Rekognition

Age 23-37
Gender Female, 57.2%
Calm 90.9%
Sad 4%
Happy 2.6%
Confused 0.9%
Surprised 0.6%
Angry 0.4%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 34-50
Gender Male, 54.2%
Calm 68.3%
Sad 10%
Happy 9.1%
Angry 7.7%
Surprised 1.6%
Disgusted 1.2%
Confused 1.1%
Fear 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%

Categories

Imagga

paintings art 99.4%

Text analysis

Amazon

Marx
Schaffner
&
rt Schaffner & Marx
with
T
rt
emily NEW with HANY BEST
HANY
NEW
BEST
emily
DO

Google

rt
&
Marx
rt Schaffner & Marx
Schaffner