Human Generated Data

Title

Untitled (arrangement of objects including a bag of cement and a hat)

Date

1917

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11751

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 98.8
Apparel 98.8
Person 82.9
Human 82.9
Furniture 80.1
Footwear 79.8
Person 76.7
Shoe 75.6
Person 69.9
Person 60.1
Bed 57.9
Shorts 56.7
Bag 56.6

Clarifai
created on 2023-10-25

people 99.6
monochrome 98.1
wear 97.1
street 96.2
man 94.9
one 93.9
adult 93.3
group together 91.5
furniture 87.7
administration 85.4
outfit 85.3
vehicle 81.9
military 80.2
woman 80.1
war 75.9
action 74.1
recreation 73.3
outerwear 72.9
group 72.4
art 72.1

Imagga
created on 2022-01-15

man 29.6
person 24.4
newspaper 21.4
people 20.1
male 19.9
work 18.8
product 17.6
adult 16.9
business 15.8
job 15
worker 14.5
equipment 14.4
working 14.1
shop 13.7
home 13.6
creation 12.9
black 12.9
occupation 12.8
professional 12
office 11.6
lifestyle 11.6
interior 11.5
computer 11.2
sitting 11.2
men 11.2
laptop 11.1
device 10.9
technology 10.4
table 9.9
indoors 9.7
casual 9.3
hand 9.1
attractive 9.1
sexy 8.8
businessman 8.8
happy 8.8
boy 8.7
clothing 8.7
industry 8.5
portrait 8.4
pretty 8.4
fashion 8.3
holding 8.3
human 8.2
hat 7.8
smile 7.8
corporate 7.7
room 7.6
one 7.5
floor 7.4
machine 7.3
helmet 7.2
suit 7.2
book 7.2
paper 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.5
black and white 94.4
ship 85.4
monochrome 57.8
house 51.3
clothes 17.4
cluttered 15.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Male, 98.9%
Sad 46.6%
Fear 42.8%
Calm 3.3%
Confused 3.2%
Disgusted 1.6%
Surprised 1.3%
Angry 0.7%
Happy 0.4%

Feature analysis

Amazon

Person 82.9%

Categories

Captions

Microsoft
created on 2022-01-15

text 13.2%

Text analysis

Amazon

JO
KE
MA
ALF
LEV
НЕ
WESTERN
DESERVE
94
DHA
I
830N3730
ICAL DESERVE
94 LOS NET
TRADE
ENON
CEMC
NET
32A8 YE33AB 830N3730
32A8
TRADE MARK RESISTERED
LOS
HON'S
ICAL
YE33AB
MARK RESISTERED
star

Google

НЕ JO KE LEV MA ALF AL DESERVE
LEV
MA
AL
НЕ
JO
KE
ALF
DESERVE