Human Generated Data

Title

Untitled (view of employees in factory)

Date

1907

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 4.2002.22214

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 99.4
Person 99.4
Person 98.6
Person 93.5
Workshop 90.8
Building 88.9
Person 83.7
Person 82.6
Face 77.6
Clothing 75.6
Apparel 75.6
Art 66.7
Indoors 63.8
Person 60.5
Factory 58.9
Room 58.4
Person 44.7
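The Amazon tags above follow Rekognition's label/confidence format, with repeated entries (several Person detections at different confidences). A minimal sketch of summarizing such a list by keeping the highest confidence per label and dropping low-confidence hits — the sample data is drawn from the tags above, and the 60% threshold is an illustrative assumption, not part of the record:

```python
# Deduplicate (label, confidence) pairs: keep the highest confidence
# seen for each label, then filter out labels below a threshold.
def top_labels(pairs, threshold=60.0):
    best = {}
    for label, conf in pairs:
        if conf > best.get(label, 0.0):
            best[label] = conf
    # Return surviving labels sorted by confidence, highest first.
    return sorted(
        ((label, conf) for label, conf in best.items() if conf >= threshold),
        key=lambda lc: -lc[1],
    )

# Sample taken from the Amazon tag list above.
tags = [("Person", 99.4), ("Person", 98.6), ("Workshop", 90.8),
        ("Factory", 58.9), ("Person", 44.7)]
```

With this sample, `top_labels(tags)` collapses the three Person entries into one at 99.4 and drops Factory as below threshold.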

Imagga
created on 2022-03-11

barbershop 49.9
shop 41.8
mercantile establishment 31.3
building 30.7
interior 30.1
chair 28.8
urban 27.1
architecture 25.1
transportation 25.1
inside 24.8
room 22.8
place of business 21.9
city 21.6
seat 20.6
modern 20.3
station 19.4
window 18.3
travel 18.3
business 17.6
indoors 17.6
industry 17.1
passenger 16.8
floor 16.7
train 16
steel 15.9
structure 15.5
factory 15.4
industrial 15.4
empty 14.6
furniture 14.3
glass 14
office 14
transport 13.7
metal 13.7
light 13.4
hall 13.1
airport 12.7
classroom 12.3
center 12.1
construction 12
work 11.9
subway 11.8
railway 11.8
establishment 11.7
salon 11.4
table 11.3
people 11.2
old 10.4
wheelchair 10.4
scene 10.4
vehicle 10.3
life 10.3
dairy 10.2
metro 9.9
ceiling 9.8
move 9.6
roof 9.5
man 9.4
equipment 9
design 9
reflection 8.9
technology 8.9
pipes 8.9
walkway 8.8
tube 8.7
engineering 8.6
house 8.4
hallway 7.9
railroad 7.9
corridor 7.9
rail 7.9
manufacturing 7.8
mall 7.8
busy 7.7
wall 7.7
school 7.6
power 7.6
fast 7.5
plant 7.5
place 7.4
car 7.4
indoor 7.3
bank 7.2
decor 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

building 99.2
text 94
drawing 86.3
black and white 68.4
factory 53.5
table 51.5
old 40.6
train 28.1
sale 10.5

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 97.5%
Happy 31.5%
Calm 27.1%
Disgusted 17.8%
Angry 11.6%
Sad 7.1%
Fear 2.9%
Surprised 1%
Confused 1%

AWS Rekognition

Age 23-31
Gender Male, 90.3%
Calm 65%
Happy 15.9%
Sad 10.2%
Fear 2.6%
Angry 2.3%
Disgusted 1.6%
Surprised 1.2%
Confused 1.1%

AWS Rekognition

Age 16-24
Gender Female, 98.1%
Calm 61.8%
Confused 9.3%
Sad 8.4%
Angry 7.9%
Fear 7.8%
Happy 2%
Disgusted 1.8%
Surprised 0.9%

AWS Rekognition

Age 28-38
Gender Male, 57.8%
Calm 78.6%
Sad 15.5%
Angry 1.9%
Happy 1.5%
Confused 0.7%
Fear 0.7%
Surprised 0.7%
Disgusted 0.5%
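Each AWS Rekognition face block above lists per-emotion confidence percentages that sum to roughly 100%; a common way to summarize a face is to take its highest-scoring emotion. A small illustrative sketch, using data mirroring the first face block above:

```python
# Pick the highest-scoring emotion from a dict of
# emotion name -> confidence percentage.
def dominant_emotion(scores):
    return max(scores.items(), key=lambda kv: kv[1])

# Scores copied from the first AWS Rekognition face above.
face1 = {"Happy": 31.5, "Calm": 27.1, "Disgusted": 17.8,
         "Angry": 11.6, "Sad": 7.1, "Fear": 2.9,
         "Surprised": 1.0, "Confused": 1.0}
```

For this face, `dominant_emotion(face1)` returns `("Happy", 31.5)` — though with Happy and Calm so close, the single top label should be read cautiously.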

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people standing in front of a building 87.3%
a group of people standing in front of a store 79.3%
a group of people standing outside of a building 79.2%

Text analysis

Amazon

H.P.J.
-LO

Google

H.P. J.
H.P.
J.