Human Generated Data

Title

Untitled (man next to trailer with fishing pole)

Date

1951

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5060

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.4
Human 99.4
Clothing 83.2
Apparel 83.2
Outdoors 79.7
Leisure Activities 62
Water 60.8
Shorts 60.6
Female 59.6
Shoe 58.5
Footwear 58.5
Angler 57.9
Fishing 57.9

Clarifai
created on 2023-10-26

people 99.6
broom 99.4
housework 98.6
cleaner 98.5
one 98.2
tidy 96.9
adult 96.6
bucket 96
dust 95.9
job 95.6
maid 95.3
vehicle 95.1
monochrome 94.7
family 92.1
wear 90
house 88.3
man 88.2
vacuum 88
indoors 84.6
washer 83.4

Imagga
created on 2022-01-22

pay-phone 79.7
telephone 66.6
electronic equipment 49.8
equipment 37.9
washer 21.2
transportation 20.6
white goods 18.2
call 16.8
home appliance 16.7
car 16.6
cleaner 16.5
transport 15.5
appliance 15.4
person 15.1
work 14.9
adult 14.9
people 14.5
business 13.4
industry 12.8
station 12.6
man 12.1
automobile 11.5
office 11.5
smile 11.4
technology 11.1
working 10.6
vehicle 10.5
smiling 10.1
communication 10.1
happy 10
travel 9.8
job 9.7
interior 9.7
drive 9.4
inside 9.2
industrial 9.1
indoors 8.8
water 8.7
auto 8.6
machine 8.5
pretty 8.4
occupation 8.2
outdoors 8.2
worker 8
male 7.8
door 7.8
device 7.7
gas 7.7
room 7.6
clean 7.5
pump 7.5
holding 7.4
back 7.3
computer 7.2
building 7.2
home 7.2
life 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.1
outdoor 88.5
black and white 87.4
clothing 83.9
person 76.8
white 61.6
footwear 61.4
old 40.1

Color Analysis

Feature analysis

Amazon

Person 99.4%
Shoe 58.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

31117
TEAT

Google

31117
31117