Human Generated Data

Title

Untitled (person in hat standing at bulletin board)

Date

c. 1936

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4792

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 99.5
Apparel 99.5
Person 98.7
Human 98.7
Train 86.8
Transportation 86.8
Vehicle 86.8
Hardhat 73.4
Helmet 73.4
Door 62.3
Photography 62.1
Photo 62.1
Portrait 60.9
Face 60.9
Astronaut 57.3
Home Decor 56.6
Coat 56.1
Flooring 55.5
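
The labels above follow the name/confidence format returned by AWS Rekognition's label detection. A minimal sketch of how similar output could be produced with boto3 (the image file name is hypothetical and AWS credentials are assumed to be configured; this is not the museum's actual pipeline):

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4792.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score listed above is 55.5
)

# Print label names with confidence, mirroring the "Clothing 99.5" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")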

Clarifai
created on 2023-10-27

people 99.9
adult 97.6
wear 96.7
military 96.6
one 96.3
war 94.8
uniform 94.8
outfit 94.7
vehicle 92.9
two 92.9
man 90.8
monochrome 90
leader 88
group 87.5
administration 85.7
group together 85.4
three 85
aircraft 81.8
veil 80.5
watercraft 80.3

Imagga
created on 2022-01-29

blackboard 23.7
man 22.8
film 20.5
people 20.1
technology 20
negative 18.9
work 18
male 17
equipment 15.3
barbershop 15
hand 14.6
professional 14.6
science 14.2
medical 14.1
digital 13.8
shop 13.6
business 13.4
person 13.1
photographic paper 12.9
information 12.4
medicine 12.3
computer 12
modern 11.9
room 11.3
doctor 11.3
education 11.2
art 11.1
device 10.7
research 10.5
screen 10.3
men 10.3
mercantile establishment 10
portrait 9.7
scientific 9.7
communication 9.2
global 9.1
worker 8.9
photographic equipment 8.8
scientist 8.8
adult 8.5
black 8.4
human 8.2
office 8.1
instrument 8.1
school 8.1
interior 8
design 7.9
hospital 7.7
laboratory 7.7
health 7.6
clock 7.6
power 7.6
writing 7.5
coat 7.5
network 7.4
care 7.4
display 7.4
data 7.3
music 7.2
world 7.1
working 7.1
businessman 7.1
life 7

Microsoft
created on 2022-01-29

text 98.2
black and white 88.3
person 82.8
clothing 77.9
piano 61.7
human face 50.4
posing 41.9

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 85.6%
Calm 49.4%
Surprised 25%
Fear 23.3%
Happy 1.1%
Disgusted 0.6%
Sad 0.3%
Confused 0.2%
Angry 0.2%
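
Scores like these are the facial attributes Rekognition returns when all attributes are requested from face detection. A minimal sketch under the same assumptions as above (hypothetical file name, configured credentials):

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.4792.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates to the response.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")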

Feature analysis

Amazon

Person 98.7%
Train 86.8%

Categories

Imagga

paintings art 99.4%

Text analysis

Amazon

EVEETA
EVEETA BVSZ
BVSZ
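
These strings are raw OCR detections; Rekognition's text detection returns whole lines as well as the individual words within them, which is why a combined string appears alongside its parts. A minimal sketch, again with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.4792.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD, so lines and their component words both appear.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])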

Google

VEEIA BE
VEEIA
BE