Human Generated Data

Title

Untitled (women working at conveyor belt in factory)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12171

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.2
Human 99.2
Person 98.9
Person 95.4
Furniture 93.5
Building 85.3
Person 80.4
Indoors 69.4
Room 69.4
People 67.5
Sports 65.3
Skateboard 65.3
Sport 65.3
Factory 61.5
Clinic 59.8
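
The label/confidence pairs above are the kind of output Amazon Rekognition's label detection returns. The sketch below is illustrative only and is not the pipeline that produced this record; the boto3 setup, the local file name photo.jpg, and the thresholds are assumptions.

```python
import boto3

# Illustrative only: request general object/scene labels for a local image file.
# The file name "photo.jpg" and the thresholds are assumptions, not values from this record.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Prints pairs comparable to the list above, e.g. "Person 99.2"
    print(f"{label['Name']} {label['Confidence']:.1f}")
```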

Clarifai
created on 2023-10-26

people 99.5
adult 98.6
monochrome 98.4
woman 97.5
man 97
transportation system 96.1
sitting 94.9
group 93.1
child 91.4
vehicle 90.6
wear 88.5
group together 86.4
veil 85.5
sit 84.1
lid 84
adolescent 83.1
recreation 80.9
watercraft 80.3
indoors 79.2
reclining 78.8
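
The concept/confidence pairs above resemble predictions from Clarifai's general image-recognition model. The sketch below is a hedged illustration against Clarifai's v2 REST API; the API key placeholder, the image URL, and the general-image-recognition model ID are assumptions and may not match the deployment that produced these tags.

```python
import requests

# Illustrative sketch of a Clarifai v2 predict call; not the museum's pipeline.
# CLARIFAI_API_KEY, IMAGE_URL, and the model ID below are placeholders/assumptions.
CLARIFAI_API_KEY = "YOUR_API_KEY"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Comparable to the concept/confidence pairs above, e.g. "people 99.5"
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```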

Imagga
created on 2022-01-22

sketch 17.6
art 17.3
comic book 17
drawing 16.5
man 15.5
person 14.3
automaton 13.1
people 12.8
male 12.1
negative 11.1
representation 10.8
face 10.6
business 10.3
grunge 10.2
silhouette 9.9
device 9.9
style 9.6
black 9.6
design 9.6
costume 9.5
party 9.5
city 9.1
film 8.9
mask 8.8
symbol 8.8
lifestyle 8.7
astronaut 8.4
modern 8.4
adult 8.4
fashion 8.3
equipment 8.2
paint 8.1
cartoon 8
work 7.9
portrait 7.8
men 7.7
human 7.5
sport 7.4
newspaper 7.4
reflection 7.2
team 7.2
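
The tag/confidence pairs above resemble output from Imagga's tagging endpoint. The sketch below is illustrative only; the API key and secret placeholders and the image URL are assumptions.

```python
import requests

# Illustrative sketch of an Imagga v2 tagging request; not the museum's pipeline.
# The API key/secret and image URL below are placeholders.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Comparable to the tag/confidence pairs above, e.g. "sketch 17.6"
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```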

Google
created on 2022-01-22

Outerwear 95.5
Shirt 93.9
Motor vehicle 86.7
Automotive design 84.4
T-shirt 79.8
Hat 78.7
Monochrome 70.5
Engineering 70.5
Table 70.4
Room 69.1
Art 69.1
Font 68.9
Desk 68.5
Monochrome photography 66.5
Chair 66.4
Machine 66.3
Crew 65.8
Team 61.7
Illustration 61.2
Photographic paper 58
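
The label/score pairs above are the kind of output Google Cloud Vision label detection returns. The sketch below is illustrative only, not the museum's pipeline; it assumes application-default credentials and a local file named photo.jpg.

```python
from google.cloud import vision

# Illustrative sketch of a Cloud Vision label request; not the museum's pipeline.
# Assumes application-default credentials and a local file named "photo.jpg".
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Scores are 0-1 floats; scaled here to match the percent-style values above.
    print(f"{label.description} {label.score * 100:.1f}")
```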

Microsoft
created on 2022-01-22

text 99.8
book 98.3
drawing 84.5
sketch 78.6
person 77.2
clothing 71.7
black and white 59.2
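
The tag/confidence pairs above resemble output from Azure's Computer Vision tagging operation. The sketch below is illustrative only; the endpoint, key, and file name are placeholders, and the SDK call shown is one way such tags could be requested, not necessarily how this record was produced.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Illustrative sketch of an Azure Computer Vision tagging call; not the museum's pipeline.
# The endpoint, key, and file name below are placeholders.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # Confidences are 0-1 floats; scaled to match values such as "text 99.8"
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```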

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 74.8%
Calm 99.1%
Sad 0.7%
Happy 0.1%
Angry 0%
Confused 0%
Surprised 0%
Disgusted 0%
Fear 0%
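
The age range, gender, and emotion scores above match the shape of Amazon Rekognition's face-attribute output. The sketch below is illustrative only; the file name is a placeholder.

```python
import boto3

# Illustrative sketch of a Rekognition face-attribute request; not the museum's pipeline.
# The file name "photo.jpg" is a placeholder.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types such as CALM or SAD, comparable to the values above
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```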

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

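The likelihood ratings above match the shape of Google Cloud Vision's face-detection output. The sketch below is illustrative only; it assumes application-default credentials and a local file named photo.jpg.

```python
from google.cloud import vision

# Illustrative sketch of a Cloud Vision face-detection request; not the museum's pipeline.
# Assumes application-default credentials and a local file named "photo.jpg".
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY, matching the labels above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```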

Feature analysis

Amazon

Person 99.2%
Skateboard 65.3%
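
The detected objects above correspond to Rekognition labels that come with located instances (bounding boxes). The sketch below shows one way such instances could be read from a detect_labels response; it is illustrative only and the file name is a placeholder.

```python
import boto3

# Illustrative sketch of reading object instances (bounding boxes) from a
# Rekognition label response; not the museum's pipeline. "photo.jpg" is a placeholder.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        # Labels with located instances, e.g. "Person 99.2%" or "Skateboard 65.3%"
        print(f"{label['Name']} {instance['Confidence']:.1f}% at "
              f"left={box['Left']:.2f}, top={box['Top']:.2f}")
```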

Categories

Imagga

paintings art 100%
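
The single category above resembles output from Imagga's categorization endpoint. The sketch below is a hedged illustration; the key and secret placeholders, the image URL, and the personal_photos categorizer ID are assumptions.

```python
import requests

# Illustrative sketch of an Imagga v2 categorization request; not the museum's pipeline.
# The key/secret, image URL, and the "personal_photos" categorizer ID are assumptions.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for category in response.json()["result"]["categories"]:
    # Comparable to "paintings art 100%"
    print(f"{category['name']['en']} {category['confidence']:.0f}%")
```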

Text analysis

Amazon

12404.
12404
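
The strings above are raw OCR output of the kind Amazon Rekognition text detection returns (both line- and word-level entries, which is why "12404." and "12404" both appear). The sketch below is illustrative only; the file name is a placeholder.

```python
import boto3

# Illustrative sketch of a Rekognition text-detection request; not the museum's pipeline.
# The file name "photo.jpg" is a placeholder.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        # Lines of detected text, comparable to "12404." above
        print(detection["DetectedText"])
```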

Google

1240
12404:
12404.
1240 4 12404. N YIHA2-MAMIZA 12404: 12404.
4
N
YIHA2-MAMIZA
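
The Google entries are similar raw OCR output: the long line is the full detected text and the remaining entries are individual tokens. The sketch below is illustrative only; it assumes application-default credentials and a local file named photo.jpg.

```python
from google.cloud import vision

# Illustrative sketch of a Cloud Vision OCR request; not the museum's pipeline.
# Assumes application-default credentials and a local file named "photo.jpg".
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation holds the full detected text; later entries are individual
# tokens, comparable to the "12404" and "YIHA2-MAMIZA" fragments above.
for annotation in response.text_annotations:
    print(annotation.description)
```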