Human Generated Data

Title

Untitled (man going through tackle box on ground)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7317

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2022-01-15

Person 99.1
Human 99.1
Clothing 96.2
Apparel 96.2
Shorts 93
Helmet 92.8
Building 73.7
People 62.5
Alloy Wheel 61.4
Machine 61.4
Wheel 61.4
Spoke 61.4
Factory 57.3
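
The Amazon labels above follow the format of Amazon Rekognition's label-detection output: a label name paired with a confidence score in percent. A minimal sketch of how such tags could be reproduced, assuming boto3 with configured AWS credentials and a hypothetical local copy of the photograph (the museum's actual tagging pipeline is not documented on this page):

# Sketch only; file name and thresholds are assumptions, not the museum's setup.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_7317.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # roughly the lowest score shown above (Factory 57.3)
)

# Each label carries a name and a confidence in percent, e.g. "Person 99.1".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
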

Clarifai
created on 2023-10-26

people 99.7
monochrome 98.2
man 98.1
adult 97.3
two 95.5
group together 93.3
vehicle 90.4
sitting 88.7
three 88.3
group 85.6
recreation 85.1
transportation system 81.9
wear 81.7
furniture 81.6
interaction 81.1
kneeling 80.7
sports equipment 75.2
baseball 75.1
competition 72.5
street 69.3

Imagga
created on 2022-01-15

dishwasher 84
white goods 65.7
home appliance 49.4
appliance 34.6
laptop 30.3
work 24.6
man 24.2
computer 22.5
people 22.3
technology 22.3
working 22.1
person 21.3
business 20
adult 16.8
durables 16.5
job 15.9
professional 15.2
male 14.2
equipment 13.1
office 12.8
worker 12.4
hand 12.2
newspaper 11.5
engineer 11.3
building 11.2
men 11.2
construction 11.1
looking 10.4
portrait 10.4
sky 10.2
lifestyle 10.1
communication 10.1
businesswoman 10
businessman 9.7
outdoors 9.7
plan 9.4
sitting 9.4
product 9
home 8.8
notebook 8.7
day 8.6
manager 8.4
studio 8.4
house 8.4
human 8.2
success 8
steel 8
women 7.9
architecture 7.8
modern 7.7
happy 7.5
holding 7.4
executive 7.4
occupation 7.3
metal 7.2
interior 7.1

Microsoft
created on 2022-01-15

outdoor 97.8
text 95.5
person 93.9
black and white 81.5
clothing 73.1

Feature analysis

Amazon

Person 99.1%
Helmet 92.8%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

Tea
31119
Mayfair
-
COD

Google

31119
31119
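
The text detections above resemble OCR output from these services: Amazon Rekognition's DetectText returns both full lines and individual words, and Google Cloud Vision likewise reports the full text block alongside each detected word, which may be why "31119" appears twice under Google. A minimal sketch of the Rekognition side, assuming boto3 and the same hypothetical local image file as in the earlier example; the page does not state how these detections were actually produced:

# Sketch only; assumes boto3 and AWS credentials are already configured.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_7317.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Printing only WORD-level detections yields entries like "Tea", "31119", "Mayfair".
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], round(detection["Confidence"], 1))
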