Human Generated Data

Title

Untitled (interior of department store during sale)

Date

1950

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6266

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 93
Human 93
Building 90.4
Person 86.3
Person 84.3
Factory 76.4
Furniture 75
Restaurant 73.4
Cafeteria 72.7
Person 60.7
Table 59.7
Lab 59
Person 42.9
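
The label/confidence pairs above are automated tags; the numbers are confidence scores on a 0-100 scale. As a minimal sketch (assuming a hypothetical local copy of the photograph named "photo.jpg" and AWS credentials already configured), labels of this shape can be pulled from Amazon Rekognition with boto3:

```python
import boto3

# Minimal sketch: "photo.jpg" is a hypothetical local copy of the image,
# and AWS credentials are assumed to be configured in the environment.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=40,
    )

for label in response["Labels"]:
    # Each label pairs a name with a confidence score (0-100),
    # matching entries such as "Person 93" and "Building 90.4" above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```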

Imagga
created on 2022-01-22

restaurant 50.1
counter 44.2
interior 43.3
table 43.1
chair 33.3
room 29.3
furniture 27.4
dinner 23.7
cafeteria 23.3
building 22.5
modern 22.4
dining 21.9
glass 21.8
luxury 21.4
decor 21.2
service 19.4
shop 18.9
office 18.8
design 18.6
indoors 17.6
inside 17.5
hall 17.4
party 17.2
setting 16.4
banquet 16
drink 15.9
house 15.9
food 15.7
architecture 15.6
hotel 15.3
eat 15.1
bar 14.8
elegant 14.6
meal 13.9
decoration 13.7
empty 13.7
kitchen 13.7
home 13.6
center 13.3
business 12.8
catering 12.7
light 12.7
napkin 12.7
mercantile establishment 12.6
lunch 12.2
floor 12.1
wedding 12
reception 11.8
city 11.6
structure 11.5
salon 11.4
wine 11.3
place 11.2
indoor 11
tables 10.8
silverware 10.8
chairs 10.8
cutlery 10.8
knife 10.6
fork 10.5
style 10.4
contemporary 10.3
event 10.2
elegance 10.1
tablecloth 9.8
urban 9.6
celebration 9.6
comfortable 9.5
wood 9.2
stylish 9
seat 8.9
plates 8.8
wooden 8.8
flowers 8.7
formal 8.6
china 8.5
supermarket 8.1
place of business 7.9
stool 7.9
work 7.8
nobody 7.8
plate 7.6
vacation 7.4
barroom 7.3
dish 7.2
silver 7.1
travel 7
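
Imagga exposes a comparable tagging endpoint over plain HTTPS. A minimal sketch, assuming a hypothetical Imagga API key/secret pair and a local copy of the image:

```python
import requests

# Hypothetical credentials; Imagga's /v2/tags endpoint uses HTTP Basic auth.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score (0-100),
    # e.g. "restaurant 50.1" or "counter 44.2" above.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```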

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

black and white 97.6
skyscraper 91.3
city 90.6
text 89.4
indoor 85.6
building 83.3
sky 83
ship 81.4
monochrome 68.2
black 67.1

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Female, 81.2%
Sad 47.6%
Calm 44.7%
Happy 4.9%
Angry 0.7%
Fear 0.6%
Confused 0.6%
Disgusted 0.5%
Surprised 0.3%

AWS Rekognition

Age 29-39
Gender Male, 78.8%
Happy 43.2%
Calm 33.2%
Fear 10.1%
Sad 3.5%
Surprised 2.8%
Angry 2.7%
Disgusted 2.6%
Confused 1.8%

AWS Rekognition

Age 21-29
Gender Male, 99.7%
Calm 29.4%
Happy 18.6%
Sad 15.4%
Angry 11.6%
Disgusted 9.3%
Fear 7.1%
Surprised 4.8%
Confused 3.7%
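
The per-face age, gender, and emotion breakdowns above follow the shape of Amazon Rekognition's face-detection response. A minimal sketch, assuming configured AWS credentials and a hypothetical local file name:

```python
import boto3

client = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with confidence scores; sorting them puts the
    # most likely emotion first, as in the per-face lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```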

Feature analysis

Amazon

Person 93%

Captions

Microsoft

a group of people in a room 85.5%
a group of people standing in a room 80.8%
an old photo of a group of people in a room 80.7%
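
Ranked captions of this form (and the flat tag list under "Microsoft" above) are the kind of output Azure's Computer Vision image-description call returns; its confidences are reported on a 0-1 scale. A minimal sketch, assuming a hypothetical Azure endpoint and subscription key:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-subscription-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    description = client.describe_image_in_stream(f, max_candidates=3)

# Candidate captions, comparable to the three ranked captions above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")

# The same response also carries a flat tag list.
print(", ".join(description.tags))
```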

Text analysis

Amazon

EXIT
CLOTHE
WORK CLOTHE
WEAR
WORK
RE
N
F
100
35
KODAKSLA
PARTITED
REBS SHOES-
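
The fragments above are raw OCR hits on the store signage; Rekognition returns both line- and word-level detections, which is why partial strings such as "WORK CLOTHE" and "RE" appear side by side. A minimal sketch, assuming configured AWS credentials and a hypothetical local file name:

```python
import boto3

client = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Keep only LINE-level detections; WORD-level entries repeat the
    # same signage fragments individually.
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} {detection['Confidence']:.1f}%")
```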

Google

PARIDED- ENS SHOES- MORK CLOTHE O WEAR
ENS
O
MORK
PARIDED-
WEAR
SHOES-
CLOTHE
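
Google Cloud Vision's text detection produces the same kind of output: a full-text annotation first, then one entry per recognized word, which is why fragments such as "MORK" and "PARIDED-" repeat above. A minimal sketch, assuming application-default Google credentials and a hypothetical local file name:

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full recognized block; the remaining
# annotations are the individual words listed separately above.
for annotation in response.text_annotations:
    print(annotation.description)
```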