Human-Generated Data

Title

Untitled (barbecue for the American Institute of Baking)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8888

Copyright

© Estate of Joseph Janney Steinmetz

Machine-Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 98.5
Shorts 98.3
Clothing 98.3
Apparel 98.3
Person 97.9
Meal 97.8
Food 97.8
Person 97.3
Person 95.7
Person 93.8
Person 91
Outdoors 87.8
Plant 84.3
Nature 83
Tree 81.6
Person 79.5
Person 78.1
People 74.8
Grass 72.4
Person 70.8
Dish 69.1
Female 67.2
Face 67
Picnic 66.7
Leisure Activities 66.7
Vacation 66.7
Furniture 56.8

Clarifai
created on 2023-10-26

people 99.8
adult 98
man 97
group together 96.9
group 96.1
many 96
monochrome 95.8
war 94
military 93.2
flame 92.7
street 92.5
woman 91.1
wear 90.5
smoke 90.3
vehicle 89.8
tent 88.7
furniture 87.6
home 87.2
two 87.1
campsite 86.6

Imagga
created on 2022-01-15

sky 23.6
landscape 23
cloud 17.2
smoke 16.7
industrial 16.3
old 16
danger 15.5
device 15.1
scene 14.7
bench 14.2
man 14.1
environment 13.2
outdoor 13
winter 12.8
weapon 12.7
travel 12.7
dirty 12.6
factory 12.6
vehicle 12.4
sax 12.3
scenic 12.3
gun 12.2
water 12
industry 11.9
snow 11.8
steam 11.6
tree 11.5
park 11.5
outdoors 11.3
summer 10.9
protection 10.9
scenery 10.8
black 10.8
destruction 10.7
building 10.7
toxic 10.7
male 10.6
chemical 10.6
weaponry 10.4
clouds 10.1
power 10.1
people 10
structure 10
person 9.9
vintage 9.9
mask 9.6
dark 9.2
vacation 9
river 8.9
nuclear 8.7
forest 8.7
pollution 8.6
armament 8.6
architecture 8.6
dangerous 8.6
artillery 8.5
grunge 8.5
adult 8.4
tourism 8.2
rifle 8.2
retro 8.2
tourist 8.2
conveyance 8.1
light 8
lifestyle 7.9
rural 7.9
work 7.8
machine 7.8
billboard 7.8
season 7.8
protective 7.8
antique 7.8
space 7.8
field artillery 7.7
field 7.5
cloudy 7.5
natural 7.4
countryside 7.3
grass 7.1
trees 7.1
day 7.1
paper 7.1
torpedo 7

Google
created on 2022-01-15

Black 89.6
Organism 86.7
Black-and-white 85.2
Style 83.8
Adaptation 79.3
Font 79
Rectangle 78.8
Art 78.7
Tints and shades 77.3
Plant 76.3
Chair 76
Monochrome photography 75.6
Table 75.3
Monochrome 72.7
Vintage clothing 71.2
Room 70.9
Classic 65.1
Tree 64.6
Sitting 64.1
Visual arts 64

Microsoft
created on 2022-01-15

table 94.2
text 93.2
furniture 92
black and white 91.8
chair 73.8
monochrome 62.8
house 54.7
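Each machine-generated tag above pairs a label with a confidence score (0–100). As an illustration only, assuming nothing beyond the "label score" line format shown above, a minimal sketch of parsing such lines and filtering them by a confidence threshold:

```python
def parse_tags(lines):
    """Parse 'Label 97.8'-style lines into (label, confidence) pairs.

    rpartition splits on the last space, so multi-word labels such as
    'group together 96.9' are handled correctly.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags


def filter_tags(tags, min_confidence=90.0):
    """Keep only tags whose confidence meets the threshold."""
    return [(label, c) for label, c in tags if c >= min_confidence]


# Sample lines taken from the Amazon tag list above.
sample = ["Person 99.6", "Meal 97.8", "Furniture 56.8"]
print(filter_tags(parse_tags(sample)))  # [('Person', 99.6), ('Meal', 97.8)]
```

Such a threshold explains why, for example, low-confidence labels like "Furniture 56.8" are often dropped when tags are displayed or indexed.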

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 64.3%
Calm 99.9%
Happy 0%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 31-41
Gender Male, 94%
Calm 99.8%
Sad 0.1%
Surprised 0%
Confused 0%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 91.4%
Sad 45.1%
Calm 39.6%
Happy 6.6%
Fear 5.4%
Disgusted 1.3%
Angry 1%
Surprised 0.7%
Confused 0.4%

AWS Rekognition

Age 30-40
Gender Male, 95.5%
Calm 56.1%
Happy 16.2%
Surprised 6.3%
Fear 5.7%
Disgusted 5.4%
Sad 4.4%
Angry 4.1%
Confused 1.7%

AWS Rekognition

Age 24-34
Gender Male, 99.5%
Happy 90.9%
Sad 3.6%
Calm 1.8%
Confused 1.5%
Disgusted 0.9%
Fear 0.6%
Surprised 0.5%
Angry 0.3%

AWS Rekognition

Age 21-29
Gender Male, 74.4%
Sad 71.1%
Calm 18.2%
Confused 4.3%
Fear 2.4%
Happy 1.4%
Angry 1.2%
Disgusted 0.8%
Surprised 0.5%
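Each AWS Rekognition face record above reports one confidence value per emotion; the face is usually summarized by its highest-scoring emotion. A small sketch (illustrative only, using values from the third record above) of selecting that dominant emotion:

```python
def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])


# Values from the third AWS Rekognition record above (Age 22-30).
face = {"Sad": 45.1, "Calm": 39.6, "Happy": 6.6, "Fear": 5.4,
        "Disgusted": 1.3, "Angry": 1.0, "Surprised": 0.7, "Confused": 0.4}
print(dominant_emotion(face))  # ('Sad', 45.1)
```

Note that the top score can be far below 100% (here, 45.1%), so a dominant emotion is not necessarily a confident one.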

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft
created on 2022-01-15

text 60.2%

Text analysis

Amazon

41211
VAGOY
04
VT774°2

Google

41211
41211