Human Generated Data

Title

Untitled (men stirring large vats in factory)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12177

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 98.9
Person 98.9
Person 97.9
Person 88.3
Meal 78
Food 78
Building 75.7
Factory 71.6
Appliance 60.6
Dishwasher 60.4
Dish 59.6
Bowl 56.8

Imagga
created on 2022-01-22

container 53.9
glass 40.4
steel drum 33.2
measure 27.6
vessel 27
percussion instrument 26.8
drink 20.9
measuring cup 20.9
liquid 20.2
white goods 20.2
musical instrument 20.1
beaker 19.8
water 19.3
equipment 18.5
medical 17.6
glasses 17.6
bucket 17
transparent 17
experiment 16.6
laboratory 16.4
refrigerator 16.2
research 16.2
science 16
interior 15.9
clean 15.9
home appliance 15.7
chemical 15.5
chemistry 15.4
design 15.2
biology 15.2
beverage 14.9
glassware 14.7
lab 14.6
steel 14.1
medicine 13.2
food 12.9
metal 12.9
industry 12.8
pharmaceutical 12.7
modern 12.6
architecture 12.5
appliance 12.3
crystal 12.2
clear 12.2
cold 12
home 12
empty 11.7
bottle 11.5
aqua 11.4
kitchenware 11.1
jar 11
kitchen 10.9
h2o 10.8
biotechnology 10.8
pharmacy 10.7
stainless 10.7
scientific 10.7
cool 10.6
table 10.4
house 10
fluid 9.9
flask 9.8
cup 9.8
sample 9.7
test 9.6
object 9.5
construction 9.4
floor 9.3
tool 9.1
pot 9.1
restaurant 8.9
instrument 8.9
pharmacology 8.9
scientist 8.8
discovery 8.8
dishwasher 8.7
cooking 8.7
tube 8.7
luxury 8.6
development 8.5
alcohol 8.3
building 8.1
office 8
close 8
acid 7.9
clinical 7.8
analysis 7.8
party 7.7
health 7.6
room 7.6
drops 7.5
technology 7.4
environment 7.4
plastic 7.3
new 7.3
industrial 7.3
decoration 7.2
wet 7.1
drawing 7.1
silver 7.1
furniture 7.1

Google
created on 2022-01-22

Table 89.9
Kitchen appliance 88.7
Gas 76.5
Machine 74
Chair 73.9
Font 73.3
Monochrome 72.5
Monochrome photography 71.5
Cooking 70.8
Rectangle 69.3
Engineering 69.2
Major appliance 64.7
Room 62
Metal 61.5
Desk 60.6
Service 58.3
Shelf 54.5
Steel 53.5
Factory 50.9

Microsoft
created on 2022-01-22

Face analysis

Amazon

Google

AWS Rekognition

Age 36-44
Gender Female, 91.6%
Calm 99.3%
Happy 0.3%
Sad 0.1%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Female, 62.1%
Calm 74.5%
Happy 19.6%
Surprised 2.7%
Confused 1.4%
Disgusted 0.7%
Sad 0.6%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 27-37
Gender Male, 94.3%
Calm 48.9%
Sad 39.3%
Happy 6.2%
Angry 2.6%
Disgusted 1.3%
Fear 0.8%
Confused 0.6%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Text analysis

Amazon

97
12407
12407.
رب

Google

H
12407. H HA2-MAMICAS 12407. 12407.
12407.
HA2-MAMICAS