Human Generated Data

Title

Untitled (tour group look over balustrade in classical courtyard)

Date

c. 1965

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11302

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 96.1
Human 96.1
Person 93.6
Person 92.5
Architecture 86.9
Building 86.9
Person 82.8
Person 77.3
Person 74.7
Person 73.4
Person 71.4
Art 70.7
Person 69
Clothing 68.6
Apparel 68.6
Person 68.4
Sculpture 63.3
People 61.6
Temple 60.6
Plant 60.5
Pillar 60.4
Column 60.4
Archaeology 59.9
Crypt 58.4
Theme Park 58.2
Amusement Park 58.2
Person 58
Shrine 57.6
Worship 57.6
Person 57.5
Handrail 56.7
Banister 56.7
LCD Screen 56.6
Electronics 56.6
Screen 56.6
Monitor 56.6
Display 56.6
Ice 55.9
Nature 55.9
Outdoors 55.9
Person 55.7
Person 52.8
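Each line in these machine-generated lists pairs a label with a 0-100 confidence score. As a minimal illustrative sketch (the `parse_tags` helper is hypothetical, not part of any vendor API), such lines can be split into structured pairs, taking the last token as the score so that multi-word labels like "Theme Park" survive intact:

```python
def parse_tags(lines):
    """Turn 'Label 96.1'-style lines into (label, confidence) pairs."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # The confidence score is the last whitespace-separated token;
        # everything before it (e.g. "Theme Park") is the label.
        label, _, score = line.rpartition(" ")
        try:
            tags.append((label, float(score)))
        except ValueError:
            pass  # skip lines with no numeric score
    return tags

print(parse_tags(["Person 96.1", "Theme Park 58.2"]))
# → [('Person', 96.1), ('Theme Park', 58.2)]
```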

Clarifai
created on 2023-10-25

people 99.8
many 98.9
group 97.9
adult 97.1
man 94.7
child 94.1
woman 91
administration 89.4
group together 88.7
wear 86.2
one 83.8
indoors 83.6
ceremony 82
recreation 81.7
no person 79.7
music 78.3
crowd 77
art 76.3
musician 75.9
several 75.1

Imagga
created on 2022-01-09

fountain 69.9
structure 57.4
architecture 39.7
sculpture 33.4
building 31.7
statue 27.9
stone 27.8
monument 26.1
landmark 25.3
history 25
old 24.4
ancient 24.2
city 23.3
travel 19
historical 18.8
art 17.9
memorial 17.7
historic 17.4
famous 16.7
water 16.7
tourism 16.5
temple 16.1
marble 15.3
sky 14.7
column 13.9
palace 13.9
culture 13.7
religion 13.4
tourist 12.8
river 11.6
god 10.5
carving 10.4
destination 10.3
wall 10.1
facade 10
cemetery 9.8
capital 9.5
park 9.4
construction 9.4
outdoor 9.2
urban 8.7
religious 8.4
church 8.3
gravestone 8.2
vacation 8.2
landscape 8.2
scenery 8.1
house 8.1
detail 8
lion 7.8
architectural 7.7
cathedral 7.6
fence 7.6
symbol 7.4
exterior 7.4
street 7.4
national 7.2
tower 7.2
night 7.1
shop 7.1

Google
created on 2022-01-09

Black 89.6
Tree 86.5
Black-and-white 85.3
Style 83.8
Font 82.4
Arecales 80.5
Adaptation 79.2
Art 77.7
Monochrome photography 74.1
Plant 73.6
Monochrome 72.1
Visual arts 67.7
Rectangle 67.2
Symmetry 66.3
History 64.1
Palm tree 61.8
Holy places 60.4
Pattern 57.6
Arch 57.6
Still life photography 56.3

Microsoft
created on 2022-01-09

text 93
black and white 65
store 46.2
altar 17.7

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 75.8%
Calm 90.2%
Sad 3.9%
Disgusted 1.3%
Happy 1.2%
Confused 0.9%
Surprised 0.9%
Fear 0.9%
Angry 0.8%

AWS Rekognition

Age 30-40
Gender Male, 96%
Calm 97.1%
Happy 1%
Sad 0.6%
Disgusted 0.5%
Confused 0.3%
Fear 0.2%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 37-45
Gender Male, 81.4%
Calm 86.2%
Happy 5.8%
Disgusted 3%
Sad 1.3%
Angry 1.1%
Confused 1.1%
Surprised 0.9%
Fear 0.5%

AWS Rekognition

Age 26-36
Gender Male, 89.6%
Calm 95%
Happy 2.9%
Sad 0.9%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 20-28
Gender Female, 81.3%
Calm 54.7%
Happy 20.3%
Sad 10.8%
Fear 9.8%
Angry 1.7%
Disgusted 1.3%
Confused 0.8%
Surprised 0.6%

AWS Rekognition

Age 41-49
Gender Male, 94.3%
Calm 65.1%
Happy 33.6%
Surprised 0.4%
Sad 0.3%
Confused 0.2%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Male, 53.3%
Happy 98.9%
Calm 0.7%
Sad 0.2%
Surprised 0.1%
Fear 0%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 23-31
Gender Female, 84.6%
Happy 40.3%
Calm 35.8%
Sad 9.7%
Angry 5.7%
Surprised 5.1%
Disgusted 1.4%
Fear 1.1%
Confused 0.9%

AWS Rekognition

Age 23-31
Gender Female, 96.9%
Calm 49%
Happy 22.1%
Sad 19.4%
Angry 3%
Confused 2.7%
Disgusted 1.4%
Surprised 1.3%
Fear 1.2%

AWS Rekognition

Age 27-37
Gender Male, 79.1%
Calm 46.4%
Fear 18.7%
Happy 14.7%
Surprised 8.7%
Disgusted 6%
Confused 2.3%
Sad 1.8%
Angry 1.5%

AWS Rekognition

Age 16-24
Gender Female, 99.5%
Fear 32.8%
Calm 31.6%
Sad 22%
Disgusted 3.8%
Happy 3.7%
Angry 2.9%
Surprised 1.8%
Confused 1.4%

AWS Rekognition

Age 37-45
Gender Female, 87.7%
Calm 98.6%
Surprised 0.7%
Sad 0.3%
Confused 0.1%
Happy 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 84.6%
Happy 68.3%
Calm 22%
Fear 2.9%
Sad 2%
Disgusted 1.9%
Angry 1.3%
Surprised 1.1%
Confused 0.5%

AWS Rekognition

Age 26-36
Gender Female, 67.3%
Calm 83.4%
Happy 11.9%
Fear 1.8%
Sad 0.8%
Angry 0.8%
Surprised 0.7%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 18-24
Gender Male, 94.4%
Calm 73%
Confused 11.1%
Sad 6%
Happy 2.3%
Surprised 2.3%
Angry 1.9%
Disgusted 1.9%
Fear 1.5%

AWS Rekognition

Age 41-49
Gender Female, 68.5%
Calm 57.1%
Sad 28.1%
Happy 6.2%
Angry 4.7%
Confused 1.7%
Disgusted 0.8%
Fear 0.8%
Surprised 0.5%

AWS Rekognition

Age 29-39
Gender Female, 92.2%
Calm 68.2%
Happy 15%
Sad 9.1%
Fear 3.9%
Angry 2%
Disgusted 0.9%
Confused 0.6%
Surprised 0.5%
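Each AWS Rekognition face record above reports the same eight emotion scores, which sum to roughly 100%. A small illustrative sketch (the `dominant_emotion` helper is an assumption for this example, not a Rekognition API call) of reading off the top-scoring emotion from one such record:

```python
def dominant_emotion(record):
    """Return the (emotion, percent) pair with the highest score."""
    return max(record.items(), key=lambda kv: kv[1])

# Scores from the first face record above.
face = {
    "Calm": 90.2, "Sad": 3.9, "Disgusted": 1.3, "Happy": 1.2,
    "Confused": 0.9, "Surprised": 0.9, "Fear": 0.9, "Angry": 0.8,
}
print(dominant_emotion(face))
# → ('Calm', 90.2)
```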

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 96.1%

Captions

Microsoft
created on 2022-01-09

a display in a store 60.7%