Human Generated Data

Title

Untitled (family eating a picnic out of the back of their car)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8952

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-01-09

Person 99
Human 99
Person 98.6
Person 97.9
Person 77.5
Art 73.5
Transportation 73
Vehicle 71.9
Shorts 60.3
Clothing 60.3
Apparel 60.3
Painting 58.8
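
These labels have the shape of output from Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be reproduced with boto3; the file name, region, and MinConfidence threshold are placeholder assumptions, not taken from this record:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly matches the lowest score in the list above
    )

# Each label carries a name and a 0-100 confidence score, as listed above
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")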

Clarifai
created on 2023-10-25

people 99.8
vehicle 99.2
military 99
war 98.9
adult 98.4
group together 97.5
transportation system 96.3
military vehicle 96
weapon 95.8
soldier 95.7
skirmish 95.7
gun 95.7
tank 94.8
two 92.2
man 92
group 88.4
army 87.8
outfit 83.6
uniform 83.5
machine 80
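
Clarifai's general image-recognition model returns concepts scored 0-1. A rough sketch against its REST API; the endpoint path, model ID, token, and image URL are all assumptions for illustration:

import requests

IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},  # placeholder access token
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concept values are 0-1; scale to percentages to match the list above
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")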

Imagga
created on 2022-01-09

musical instrument 63.9
accordion 59.2
keyboard instrument 48.3
wind instrument 37.9
man 22.9
person 20.7
people 20.1
wheeled vehicle 18.8
vehicle 18.6
male 17.7
adult 16.2
outdoors 14.3
portrait 13.6
working 13.3
sitting 12.9
handcart 12.7
outdoor 12.2
danger 11.8
lifestyle 11.6
dirty 10.8
barrow 10.8
work 10.6
tool 10.5
machine 10.5
old 10.5
water 10
holding 9.9
park 9.9
black 9.6
tricycle 9.6
grass 9.5
men 9.4
equipment 9.3
tree 9.2
transportation 9
world 8.4
smoke 8.4
landscape 8.2
lady 8.1
sunset 8.1
worker 8
day 7.8
outside 7.7
industry 7.7
power 7.6
conveyance 7.5
clothing 7.4
vacation 7.4
safety 7.4
occupation 7.3
transport 7.3
teenager 7.3
road 7.2
chain saw 7.1
summer 7.1
music 7.1
rural 7.1
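
Imagga exposes comparable tagging through its /v2/tags endpoint with HTTP basic auth. A hedged sketch; the credentials and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)

# Tags arrive as {"confidence": <0-100>, "tag": {"en": <label>}}
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")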

Microsoft
created on 2022-01-09

person 95.3
clothing 88.9
text 87.9
black and white 87.4
man 77.4
old 40.1
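
These tags match the shape of Azure Computer Vision's tagging operation, presumably what produced the Microsoft results here. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),  # placeholder key
)

# tag_image returns tags with 0-1 confidence; scale to match the list above
result = client.tag_image("https://example.org/photo.jpg")  # placeholder URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")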

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Female, 68%
Calm 89.8%
Sad 4.8%
Happy 1.7%
Confused 1.6%
Surprised 0.7%
Fear 0.5%
Angry 0.5%
Disgusted 0.4%
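
The age range, gender estimate, and emotion distribution above match the response shape of Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch; the file name and region are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")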

Feature analysis

Amazon

Person 99%

Captions

Microsoft
created on 2022-01-09

an old photo of a person 84.9%
an old photo of a girl 71.9%
old photo of a person 71.8%
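
These ranked captions look like output from Azure Computer Vision's describe operation, which can return several caption candidates with confidence scores. A short sketch with the same placeholder endpoint, key, and URL as in the tagging sketch above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),  # placeholder key
)

# Ask for several ranked caption candidates, as shown in the list above
description = client.describe_image("https://example.org/photo.jpg", max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")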

Text analysis

Amazon

42324
MJ17--YT37A°--NX
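
The strings above are OCR detections; Rekognition's DetectText API returns such strings as LINE and WORD items with confidence scores. A minimal sketch, with the file name and region as placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections correspond to the strings listed above
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(f"{det['DetectedText']} ({det['Confidence']:.1f}%)")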

Google

42324
42324
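
Google Cloud Vision's text detection reports the full text block as the first annotation and individual words after it, which is likely why the same string appears twice above. A minimal sketch; the file name is a placeholder and default application credentials are assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses default application credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full detected text; the rest are individual words
for annotation in response.text_annotations:
    print(annotation.description)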