Human Generated Data

Title

Untitled (family eating a picnic out of the back of their car)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8953

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.6
Chair 98.1
Furniture 98.1
Person 97.4
Person 92.8
Clothing 87.8
Apparel 87.8
Shorts 69.8
Pants 69.4
People 65.9
Transportation 63.7
Photography 63.3
Photo 63.3
Vehicle 61.7
Soldier 60.6
Military 60.6
Military Uniform 60.6
Drawing 59.8
Art 59.8
Brick 58.4
Wall 57.4
Urban 55.1
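
The labels above follow the shape of Amazon Rekognition's DetectLabels output: a label name plus a 0-100 confidence score, including parent labels (which is why Person/Human and Clothing/Apparel appear as matched pairs). A minimal sketch of how such tags could be reproduced with boto3; the filename, region, and MinConfidence threshold are assumptions, not part of this record.

    import boto3  # assumes AWS credentials are configured in the environment

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_picnic.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed threshold; the list above bottoms out near 55
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')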

Clarifai
created on 2023-10-26

people 99.9
group together 99
adult 98.8
vehicle 98.6
group 97.6
military 97.5
man 97.3
transportation system 96.4
war 96.2
soldier 93.2
three 90.9
child 90.7
sitting 89.8
gun 89.6
uniform 89.3
two 88.2
outfit 88
woman 87.9
administration 85.3
wear 85
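
The concepts above match the output shape of Clarifai's general image recognition model, which scores concepts on a 0-1 scale (rendered as percentages here). A hedged sketch against Clarifai's v2 REST API; the model ID, key placeholder, and image URL are assumptions.

    import requests

    # Hypothetical credentials and image location.
    headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
    payload = {"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz_picnic.jpg"}}}]}

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai values are 0-1; scale to match the percentages listed above.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')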

Imagga
created on 2022-01-09

musical instrument 49.4
accordion 41.7
keyboard instrument 36.5
wind instrument 28.2
man 22.2
person 21.6
chair 19.4
working 18.6
people 17.8
wheelchair 17.8
equipment 17.5
male 17
adult 14.9
work 14.3
grass 13.4
vehicle 13.4
tool 12.1
machine 11.9
seat 11.7
industry 11.1
device 11
black 10.8
outdoors 10.4
old 10.4
men 10.3
industrial 10
lawn mower 9.9
holding 9.9
sitting 9.4
wheeled vehicle 9.3
business 9.1
technology 8.9
style 8.9
home 8.8
lifestyle 8.7
day 8.6
furniture 8.6
briefcase 8.6
barrow 8.4
portrait 8.4
outdoor 8.4
computer 8.4
handcart 8.4
power 8.4
house 8.4
laptop 8.3
fashion 8.3
park 8.2
clothing 8.2
suit 8.1
gas 7.7
construction 7.7
attractive 7.7
two 7.6
wheel 7.5
smoke 7.4
safety 7.4
occupation 7.3
danger 7.3
smiling 7.2
sexy 7.2
road 7.2
worker 7.1
job 7.1
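
Imagga's tagging endpoint reports confidence directly on a 0-100 scale, matching the list above. A minimal sketch; the API key, secret, and image URL are placeholders.

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/steinmetz_picnic.jpg"},
        auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
    )

    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')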

Google
created on 2022-01-09 (no tags recorded)

Microsoft
created on 2022-01-09

text 94.5
person 85.3
black and white 72.9
clothing 72.2
drawing 70.6
man 58.8
old 42
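
The Microsoft tags resemble Azure Computer Vision's Analyze Image output, which scores tags on a 0-1 scale (rendered as percentages here). A hedged sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are assumptions.

    import requests

    resp = requests.post(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
        params={"visualFeatures": "Tags"},
        json={"url": "https://example.org/steinmetz_picnic.jpg"},
    )

    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')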

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 56.5%
Calm 69.3%
Sad 23.5%
Fear 4.5%
Happy 1%
Disgusted 0.6%
Angry 0.5%
Confused 0.4%
Surprised 0.3%

AWS Rekognition

Age 25-35
Gender Male, 95.6%
Calm 83.4%
Happy 7.7%
Sad 7.6%
Disgusted 0.4%
Confused 0.3%
Fear 0.2%
Angry 0.2%
Surprised 0.2%
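
The age range, gender, and emotion breakdowns above match the FaceDetails structure returned by Amazon Rekognition's DetectFaces API when all attributes are requested. A minimal sketch with boto3; the filename and region are placeholders.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unsorted; sort to match the high-to-low listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')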

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
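
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the three blocks above read "Very unlikely" instead of numeric scores. A hedged sketch with the google-cloud-vision client; the filename and credential setup are assumptions.

    from google.cloud import vision  # assumes application default credentials

    client = vision.ImageAnnotatorClient()
    with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:  # one annotation per detected face
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)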

Feature analysis

Amazon

Person 99.8%

Categories

Text analysis

Google

MJI7--YT3GA T
MJI7--YT3GA
T
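
The detected strings follow Cloud Vision text detection's convention: the first annotation is the full detected string ("MJI7--YT3GA T", likely an edge marking on the negative), followed by the individual tokens. A minimal sketch; the filename is a placeholder.

    from google.cloud import vision  # assumes application default credentials

    client = vision.ImageAnnotatorClient()
    with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First entry is the full detected text; subsequent entries are tokens.
    for annotation in response.text_annotations:
        print(annotation.description)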