Human Generated Data

Title

Untitled (outdoor picnic near well)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7812

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 98.8
Meal 96.6
Food 96.6
Person 95.8
Person 90.2
Tree 88.2
Plant 88.2
Person 88.1
Person 86.7
Person 80.6
Person 80.2
Person 80.2
Shorts 77.1
Clothing 77.1
Apparel 77.1
People 70.7
Dish 69.9
Cafeteria 69.8
Restaurant 69.8
Face 68.6
Female 66.1
Table 63.1
Furniture 63.1
Dining Table 62.8
Outdoors 59.5
Fir 57.8
Abies 57.8
Indoors 56.9
Room 56.9
Buffet 56.8
Nature 55.5
Eating 55.4
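
The label and confidence pairs above have the shape of output returned by Amazon Rekognition's label detection. As a minimal sketch (the local file name and the confidence floor are assumptions, not part of this record), tags like these could be retrieved with boto3:

```python
# Sketch only: scene/object labels via Amazon Rekognition's DetectLabels.
# The file name and MinConfidence value are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the weakest tag listed above scores 55.4
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```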

Clarifai
created on 2023-10-25

people 99.9
many 98.1
adult 97.8
monochrome 97.5
man 96.7
group 96.2
group together 95.7
street 93
woman 91.5
furniture 91.3
war 91.2
tent 90.9
food 90.8
military 89.3
market 89
wear 88.9
two 88.7
one 86.6
campsite 86.6
merchant 86.2

Imagga
created on 2022-01-09

billboard 18.2
sky 17.9
man 17.5
television 16.9
structure 15.1
people 15.1
signboard 14.7
person 13.8
black 13.2
male 12.8
landscape 12.6
bench 12.5
summer 11.6
equipment 10.9
outdoors 10.8
vehicle 10.6
stage 10.3
sport 10.1
leisure 10
outdoor 9.9
park 9.9
environment 9.9
cloud 9.5
dark 9.2
wheeled vehicle 9.2
old 9.1
lifestyle 8.7
adult 8.5
travel 8.4
clouds 8.4
field 8.4
seat 8.3
computer 8.2
building 8.2
happy 8.1
water 8
destruction 7.8
tree 7.8
grunge 7.7
platform 7.6
power 7.6
fun 7.5
one 7.5
smoke 7.4
light 7.4
back 7.3
motor vehicle 7.3
industrial 7.3
dirty 7.2
monitor 7.2
night 7.1
telecommunication system 7.1
work 7.1
sports equipment 7
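
The Imagga tags above resemble the output of Imagga's v2 tagging service. A minimal sketch, assuming the public /v2/tags REST endpoint with basic-auth credentials and a hypothetical image URL (none of which appear in this record):

```python
# Sketch only: image tagging via Imagga's v2 /tags endpoint.
# Credentials and the image URL are placeholders, not real values.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
image_url = "https://example.org/steinmetz_picnic.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```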

Google
created on 2022-01-09

Food 90.8
Table 90.7
Black 89.5
Organism 85.5
Black-and-white 84.9
Style 83.9
Rectangle 83.2
Art 80.5
Adaptation 79.3
Chair 77.9
Tints and shades 77.3
Monochrome photography 76.1
Tree 74.2
Monochrome 73
Font 70.5
Cooking 68.3
Room 68.3
Plant 67.2
Vintage clothing 66.2
Outdoor furniture 65.9
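
The Google labels above match the output of Cloud Vision label detection, with scores converted to percentages. A minimal sketch using the google-cloud-vision client library (the file name is an assumption):

```python
# Sketch only: label detection with the Google Cloud Vision client library.
# The local file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1; the record above lists them as percentages.
    print(f"{label.description} {label.score * 100:.1f}")
```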

Microsoft
created on 2022-01-09

furniture 96
text 95.7
table 95.2
black and white 91.2
chair 81.9
monochrome 61.1

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 97.9%
Calm 99.2%
Sad 0.6%
Happy 0.1%
Confused 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 95.2%
Happy 37.8%
Calm 18.5%
Sad 18.3%
Confused 14.9%
Surprised 5.2%
Angry 2.5%
Disgusted 1.7%
Fear 1%

AWS Rekognition

Age 26-36
Gender Female, 51.8%
Calm 99.6%
Happy 0.2%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Confused 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 19-27
Gender Female, 67.4%
Calm 58.7%
Sad 17.2%
Disgusted 8.6%
Happy 6.3%
Confused 3.8%
Angry 2.5%
Fear 1.5%
Surprised 1.5%

AWS Rekognition

Age 27-37
Gender Male, 82.9%
Calm 94.9%
Happy 1.7%
Sad 1.7%
Disgusted 0.6%
Fear 0.4%
Confused 0.3%
Surprised 0.3%
Angry 0.2%
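
The age range, gender, and emotion estimates in the AWS Rekognition entries above correspond to the fields returned by Rekognition's DetectFaces call when all attributes are requested. A minimal sketch with boto3 (the file name is an assumption):

```python
# Sketch only: per-face age range, gender, and emotion scores via DetectFaces.
# The file name is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```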

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
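
The Google Vision entries above report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages, matching the face-detection annotations of the Cloud Vision API. A minimal sketch (the file name is an assumption):

```python
# Sketch only: per-face likelihood buckets from Cloud Vision face detection.
# The local file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

# Human-readable names, indexable by the integer likelihood fields on each face.
likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```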

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

41209
VAGON
VT37092
نية
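
The strings above are raw text detections of the kind Amazon Rekognition's DetectText returns for signage and lettering in the image. A minimal sketch with boto3 (the file name is an assumption):

```python
# Sketch only: line-level text detections with Amazon Rekognition DetectText.
# The file name is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the word-level duplicates
        print(detection["DetectedText"])
```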

Google

41209 2.
41209
2.
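
The Google text results follow the pattern of Cloud Vision text detection, where the first annotation is the full detected string ("41209 2.") and the remaining entries are its individual tokens. A minimal sketch (the file name is an assumption):

```python
# Sketch only: OCR-style text detection with the Google Cloud Vision client.
# The local file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_picnic.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first entry is the full text block; later entries are individual tokens.
    print(annotation.description)
```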