Human Generated Data

Title

Untitled (outdoor picnic near well)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7811

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99
Person 98.1
Person 95.2
Person 89.8
Outdoors 88
Person 87.5
Nature 87.1
Person 86.2
Person 79.3
Animal 78.5
Mammal 78.5
Cattle 71.9
Bull 67
Ice 65.8
Plant 55.6
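
The values beside each tag are confidence scores (percentages) reported by the tagging service. As a minimal sketch of how such label/confidence pairs might be filtered programmatically, the snippet below hardcodes a few of the Amazon tags listed above; the `confident_tags` helper is illustrative, not part of any vendor API:

```python
# (label, confidence) pairs copied from the Amazon tag list above.
tags = [
    ("Person", 99.8), ("Human", 99.8), ("Outdoors", 88.0),
    ("Nature", 87.1), ("Animal", 78.5), ("Mammal", 78.5),
    ("Cattle", 71.9), ("Bull", 67.0), ("Ice", 65.8), ("Plant", 55.6),
]

def confident_tags(pairs, threshold=80.0):
    """Return labels whose confidence meets or exceeds the threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))  # keeps only the high-confidence labels
```

At an 80% threshold this keeps "Person", "Human", "Outdoors", and "Nature", dropping the lower-confidence guesses such as "Bull" and "Ice".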

Imagga
created on 2022-01-09

gun 26.2
landscape 23.8
snow 23.1
sky 22.9
vehicle 22.4
weapon 22.4
winter 21.3
cannon 20.7
rifle 18
tree 17.7
artillery 17.4
tank 16.9
smoke 15.8
cold 15.5
old 15.3
firearm 15.1
military vehicle 14.9
scene 14.7
wheeled vehicle 14.5
building 14.5
tracked vehicle 14.4
house 14.2
outdoors 14.2
travel 14.1
park 14
armament 13.9
cloud 13.8
weaponry 12.5
outdoor 12.2
forest 12.2
field artillery 11.9
season 11.7
city 11.6
environment 11.5
high-angle gun 11.5
bench 11.4
conveyance 11.1
car 10.9
danger 10.9
trees 10.7
structure 10.6
armored vehicle 10.6
rural 10.6
sax 10.5
architecture 10.2
ice 10.1
water 10
snowy 9.7
steam 9.7
scenic 9.6
frost 9.6
frozen 9.5
light 9.4
field 9.2
countryside 9.1
vintage 9.1
scenery 9
river 8.9
antique 8.9
grunge 8.5
protection 8.2
road 8.1
frosty 7.8
military 7.7
fog 7.7
summer 7.7
construction 7.7
pollution 7.7
fire 7.5
man 7.4
vacation 7.4
street 7.4
industrial 7.3
color 7.2
dirty 7.2
black 7.2
holiday 7.2

Google
created on 2022-01-09

Table 93.2
Black 89.6
Organism 86.8
Black-and-white 84.1
Style 83.9
Motor vehicle 83.5
Plant 80.4
Adaptation 79.4
Art 78.6
Outdoor furniture 78.2
Desk 77.5
Font 74.8
Monochrome photography 74.4
Monochrome 74.3
Rectangle 72.7
Classic 72.3
Machine 69.8
Working animal 66.6
Tree 66.1
Illustration 65.4

Microsoft
created on 2022-01-09

outdoor 98.1
text 91.9
black and white 89.2
house 76.6
table 66.9
furniture 63.1

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 72.3%
Calm 95%
Happy 4.7%
Disgusted 0.1%
Surprised 0.1%
Sad 0%
Confused 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 31-41
Gender Male, 96.6%
Sad 28.6%
Confused 17.9%
Calm 17.1%
Disgusted 11%
Happy 9.5%
Angry 8.1%
Fear 4.6%
Surprised 3.2%

AWS Rekognition

Age 33-41
Gender Male, 92.4%
Calm 34.3%
Happy 25.6%
Sad 25.3%
Surprised 6%
Confused 3.6%
Fear 2.4%
Angry 1.4%
Disgusted 1.4%

AWS Rekognition

Age 29-39
Gender Male, 69.5%
Calm 92.2%
Happy 5.7%
Sad 1.3%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 99.1%
Happy 37%
Calm 20.1%
Fear 12%
Confused 9.6%
Disgusted 8.2%
Sad 6.5%
Surprised 4%
Angry 2.6%
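
Each face block above is a distribution of emotion confidences. A common way to summarize such a distribution is to pick the highest-scoring label; the sketch below uses the percentages from the face block directly above, and the `dominant_emotion` helper is illustrative rather than a vendor function:

```python
# Emotion percentages copied from the last AWS Rekognition face block above.
emotions = {
    "Happy": 37.0, "Calm": 20.1, "Fear": 12.0, "Confused": 9.6,
    "Disgusted": 8.2, "Sad": 6.5, "Surprised": 4.0, "Angry": 2.6,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # prints: Happy
```

Note that for a flatter distribution (e.g. the second face above, where Sad, Confused, and Calm are all within about 11 points of one another), the dominant label is a much weaker summary of the result.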

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people riding on the back of a horse 52%
a person riding a horse in the rain 45.5%
a group of people riding horses on a city street 45.4%

Text analysis

Amazon

41208

Google

YT37A°2
-
XAGOX
MJI7-- YT37A°2 - - XAGOX
MJI7--