Human Generated Data

Title

Untitled (woman sorting through a substance on conveyor belt)

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5119

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.7
Person 99.7
Person 99.6
Apparel 95.9
Clothing 95.9
Person 91.5
Person 89.5
Person 83.8
Hat 80.7
People 74
Crowd 67.1
Face 64.2
Hat 62.7
Meal 62.3
Food 62.3
Hand 58.2

Imagga
created on 2022-01-23

patriotic 28.8
nation 28.4
flag 27.9
graphic 24.1
fabric 23.5
art 23.3
symbol 22.9
rippling 22.6
waving 22.4
patriotism 22.3
wavy 22.2
navigation 22.1
state 22
ripple 22
clip 21.9
users 21.7
homepage 21.7
domain 21.7
commerce 21.5
e commerce 21.5
search 21.2
country 21.1
wide 21.1
national 20.8
technology 20.8
website 20.8
site 20.6
world 20.5
web 19.5
business 18.2
sign 17.3
design 16.9
person 16.1
automaton 15.9
people 15.1
negative 12.4
sport 11.5
film 10.6
crowd 10.6
human 10.5
event 10.2
body 9.6
muscular 9.6
profile 9.6
athlete 9.4
3d 9.3
silhouette 9.1
player 9
audience 8.8
fight 8.7
anatomy 8.7
skill 8.7
biology 8.5
lights 8.3
competition 8.2
team 8.1
group 8.1
versus 7.9
render 7.8
dangerous 7.6
photographic paper 7.4
training 7.4
science 7.1
male 7.1
hand 7.1
medical 7.1

Google
created on 2022-01-23

Black 89.6
Human 89.1
Black-and-white 85.7
Hat 85.5
Style 84
Adaptation 79.2
Font 78.2
Monochrome 77.1
T-shirt 75.4
Monochrome photography 75.3
Shorts 71.9
Happy 71
Art 70.8
Event 70.2
Eyewear 67.6
Advertising 66.7
Room 66.6
Illustration 64.6
Photo caption 64.6
Stock photography 64.1

Microsoft
created on 2022-01-23

person 99.9
text 99.2
black and white 87
people 60.6
clothing 57

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Female, 76.5%
Calm 98.6%
Sad 0.6%
Angry 0.2%
Surprised 0.2%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Fear 0%

Feature analysis

Amazon

Person 99.7%
Hat 80.7%

Captions

Microsoft

a group of people looking at each other 80.5%
a group of people sitting next to a window 71.4%
a group of people sitting in front of a window 70.1%

Text analysis

Amazon

12405
12405.

Google

12H05
12405. 12H05 12405.
12405.