Human Generated Data

Title

Untitled (barbecue for the American Institute of Baking)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8889

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (tag, confidence score)

Amazon
created on 2022-01-15

Person 99.8
Human 99.8
Person 98.5
Person 96.8
Person 95.5
Furniture 94.6
Chair 90.6
Person 90.2
Person 86.2
Outdoors 86.1
Person 85
Nature 83.5
Couch 83.1
Person 82.8
Plant 81.6
Tree 81.5
Person 80.5
Bird 79.7
Animal 79.7
Table 79.4
Dining Table 76.8
Yard 75.7
Meal 75.2
Food 75.2
Living Room 72.9
Indoors 72.9
Room 72.9
People 68.6
Apparel 61.2
Clothing 61.2
Military Uniform 56.9
Military 56.9
Grass 55.9
Electronics 55.5
Screen 55.5
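
The Amazon tags above are tag/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. The sketch below, which assumes boto3 and configured AWS credentials, shows how a comparable list could be produced; the file path is a hypothetical placeholder, not the museum's actual pipeline.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical path to the digitized photograph.
    with open("steinmetz_4.2002.8889.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=55.0,  # the tags above bottom out in the mid-50s
        )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")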

Clarifai
created on 2023-10-26

people 99.8
many 96.9
group 96.3
group together 95.9
man 95.9
adult 95.3
vehicle 94.8
military 93.1
war 92.6
monochrome 91.8
several 87.4
street 86.5
furniture 86.1
cavalry 84.7
home 83.3
soldier 82
no person 80.5
leader 79.1
tree 78.9
snow 78.7

Imagga
created on 2022-01-15

barbecue 20.8
old 20.2
gun 17.5
rifle 17.3
dirty 14.4
bench 14.1
tree 14.1
building 13.8
grunge 13.6
firearm 12.7
black 12.6
retro 12.3
weapon 12.2
scene 12.1
negative 11.9
industrial 11.8
vintage 11.6
park 11.5
sky 11.5
snow 11.3
travel 11.3
stone 11.2
structure 11.1
antique 10.4
winter 10.2
space 10.1
outdoors 9.9
environment 9.9
river 9.8
room 9.7
mask 9.6
forest 9.6
man 9.4
film 9.4
smoke 9.3
city 9.1
art 9.1
danger 9.1
violin 9.1
material 9
landscape 8.9
color 8.9
textured 8.8
fog 8.7
stringed instrument 8.6
dirt 8.6
season 8.6
weathered 8.5
industry 8.5
outdoor 8.4
house 8.4
dark 8.3
pattern 8.2
rough 8.2
vehicle 8
yellow 7.9
bowed stringed instrument 7.8
architecture 7.8
male 7.8
cloud 7.7
cold 7.7
texture 7.6
grungy 7.6
part 7.4
water 7.3
protection 7.3
wall 7.2
scenery 7.2
person 7.2
history 7.2
paper 7.1
autumn 7

Google
created on 2022-01-15

Table 91.1
Black 89.5
Organism 86.6
Black-and-white 85.5
Working animal 85.3
Style 83.9
Chair 82.5
Art 79.9
Adaptation 79.3
Font 78.3
Monochrome photography 75.8
Monochrome 75.2
Plant 74.2
Rectangle 73.4
Painting 71.4
Coffee table 70.5
Desk 68.6
Classic 68.5
Motor vehicle 68.4
Tree 68.1
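
The Google tags above match the label annotations returned by Cloud Vision label detection, whose scores are reported on a 0-1 scale (the listing appears to multiply them by 100). A minimal sketch, assuming the google-cloud-vision client and a hypothetical local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical path to the digitized photograph.
    with open("steinmetz_4.2002.8889.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # score is 0-1; scale to match the 0-100 figures shown above
        print(f"{label.description} {label.score * 100:.1f}")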

Microsoft
created on 2022-01-15

text 97.9
outdoor 93.1
black and white 87
furniture 85.9
table 79.7
chair 67.2
house 53

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 89.3%
Calm 99.3%
Sad 0.4%
Happy 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Female, 97.9%
Calm 98.9%
Sad 0.6%
Happy 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0%
Surprised 0%
Confused 0%

AWS Rekognition

Age 29-39
Gender Male, 98.6%
Calm 51.6%
Surprised 25.5%
Happy 13.5%
Angry 2.8%
Confused 2.4%
Sad 1.9%
Fear 1.2%
Disgusted 1%

AWS Rekognition

Age 20-28
Gender Male, 50.4%
Happy 50.5%
Fear 35%
Sad 10%
Calm 2.8%
Disgusted 0.7%
Confused 0.5%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 25-35
Gender Male, 88.7%
Calm 72.5%
Happy 8%
Angry 6.6%
Disgusted 4.7%
Sad 3.8%
Fear 1.6%
Surprised 1.4%
Confused 1.4%
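
The five AWS Rekognition records above (age range, gender, and a ranked emotion breakdown per detected face) are the shape of output produced by the DetectFaces operation with all attributes requested. A minimal boto3 sketch under the same hypothetical-file assumption:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical path to the digitized photograph.
    with open("steinmetz_4.2002.8889.jpg", "rb") as f:
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; rank them as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")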

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
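
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A comparable sketch with the google-cloud-vision client, again using a hypothetical file path:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical path to the digitized photograph.
    with open("steinmetz_4.2002.8889.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood fields are enum values such as VERY_UNLIKELY or LIKELY.
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            print(name, vision.Likelihood(value).name)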

Feature analysis

Amazon

Person 99.8%
Bird 79.7%

Text analysis

Amazon

41210
3.
KODAK--A-ITW
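
The strings above are machine-read film-edge and frame markings (partly mirrored), the kind of output returned by Rekognition's DetectText operation. A brief sketch under the same hypothetical-file assumption:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical path to the digitized photograph.
    with open("steinmetz_4.2002.8889.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections correspond to whole strings such as "41210" or "KODAK--A-ITW".
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])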

Google

--
3.
MJI7--YT3RA°2 -- XAGON 3.
MJI7--YT3RA°2
XAGON