Human Generated Data

Title

Untitled (wedding guests standing outside of church)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10732

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.7
Person 99.7
Person 99.7
Person 99.4
Person 99
Person 98.7
Person 98.4
Person 98.1
Person 97.1
Apparel 95.5
Footwear 95.5
Shoe 95.5
Clothing 95.5
Person 95.5
Person 92.7
Person 92
Person 91.5
Person 86.8
Person 85.6
Wheel 81.6
Machine 81.6
Water 79.2
Porch 78.4
People 71.3
Outdoors 66.9
Person 62.3
Person 62.1
Nature 59.4
Shorts 58.4
Patio 57.8
Urban 56.4
Bus Stop 55.2
Person 41.9
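
The Amazon tags above pair each detected label with a confidence score from 0 to 100, with repeated labels (e.g. many "Person" hits) for multiple detections. A minimal post-processing sketch, with the tag list transcribed from this record and the threshold value an illustrative assumption, for collapsing duplicates and keeping only high-confidence labels:

```python
# (label, confidence) pairs transcribed from the Amazon tags above (abridged).
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Person", 99.4), ("Apparel", 95.5),
    ("Footwear", 95.5), ("Shoe", 95.5), ("Clothing", 95.5), ("Wheel", 81.6),
    ("Machine", 81.6), ("Water", 79.2), ("Porch", 78.4), ("People", 71.3),
    ("Outdoors", 66.9), ("Nature", 59.4), ("Shorts", 58.4), ("Bus Stop", 55.2),
]

def confident_labels(tags, threshold=90.0):
    """Keep each distinct label's highest score, then drop those below threshold."""
    best = {}
    for label, score in tags:
        best[label] = max(score, best.get(label, 0.0))
    return sorted(
        [(label, score) for label, score in best.items() if score >= threshold],
        key=lambda pair: -pair[1],
    )

print(confident_labels(tags))
```

With the 90-point threshold assumed here, only the Human/Person detections and the clothing-related labels survive; lower-confidence scene guesses like "Bus Stop" (55.2) drop out.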

Imagga
created on 2022-01-15

cemetery 38
building 27.7
architecture 22.8
city 22.4
gate 20.3
turnstile 19.7
house 17.5
urban 16.6
old 15.3
scene 14.7
travel 14.1
street 13.8
people 12.8
structure 12.5
movable barrier 12.4
barrier 12.1
home 12
world 11
window 10.8
shop 10.8
transportation 10.8
wall 10.7
park 10.7
business 10.3
light 10
transport 9.1
road 9
barbershop 9
landscape 8.9
night 8.9
residence 8.8
fence 8.7
mercantile establishment 8.6
culture 8.5
walking 8.5
tree 8.5
black 8.4
station 8.4
wood 8.3
danger 8.2
dirty 8.1
new 8.1
religion 8.1
art 7.8
palace 7.7
construction 7.7
crowd 7.7
buildings 7.6
silhouette 7.4
industrial 7.3
group 7.3
history 7.2
steel 7.1
rural 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

outdoor 98.5
tree 98.2
person 85.7
text 85.6
black and white 82.7
clothing 71.3
man 70

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 97%
Happy 94.2%
Calm 1.8%
Sad 1.4%
Angry 0.9%
Disgusted 0.9%
Surprised 0.5%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 48-56
Gender Male, 89.7%
Happy 79.8%
Disgusted 11.9%
Sad 6%
Angry 0.6%
Surprised 0.5%
Calm 0.5%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 27-37
Gender Female, 74%
Happy 97.5%
Calm 1.7%
Sad 0.3%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%
Confused 0%

AWS Rekognition

Age 25-35
Gender Male, 92.9%
Calm 98%
Sad 1.2%
Confused 0.2%
Disgusted 0.2%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Female, 69.9%
Calm 98.5%
Surprised 0.4%
Fear 0.4%
Sad 0.3%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 22-30
Gender Female, 68.3%
Sad 71.2%
Calm 23.1%
Confused 3%
Angry 1.1%
Disgusted 0.5%
Fear 0.4%
Surprised 0.3%
Happy 0.3%

AWS Rekognition

Age 29-39
Gender Male, 97.9%
Happy 55.2%
Fear 27.2%
Calm 7.1%
Disgusted 3.5%
Angry 2.1%
Sad 1.9%
Surprised 1.7%
Confused 1.5%

AWS Rekognition

Age 28-38
Gender Female, 75.6%
Calm 53.2%
Happy 28.4%
Sad 15.9%
Confused 0.9%
Angry 0.5%
Disgusted 0.5%
Fear 0.4%
Surprised 0.3%

AWS Rekognition

Age 29-39
Gender Male, 98.8%
Calm 70.6%
Happy 27.9%
Sad 0.6%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 16-22
Gender Female, 98.9%
Sad 26.7%
Calm 25.6%
Fear 21.1%
Surprised 12.6%
Disgusted 7.7%
Happy 2.9%
Angry 2.6%
Confused 0.8%

AWS Rekognition

Age 34-42
Gender Male, 62.8%
Calm 95.7%
Sad 1.8%
Happy 0.9%
Fear 0.6%
Surprised 0.3%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 35-43
Gender Male, 98.6%
Calm 97%
Sad 2%
Confused 0.4%
Disgusted 0.2%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
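
Each AWS Rekognition face record above reports eight emotion confidences that together sum to roughly 100%. A minimal sketch of reducing one record to its dominant emotion, using scores transcribed from the first face above (the `dominant_emotion` helper is illustrative, not part of the service's API):

```python
# Emotion confidences transcribed from the first AWS Rekognition face record.
face = {
    "Happy": 94.2, "Calm": 1.8, "Sad": 1.4, "Angry": 0.9,
    "Disgusted": 0.9, "Surprised": 0.5, "Fear": 0.2, "Confused": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face))  # -> ('Happy', 94.2)
```

Note that the dominant emotion can still be weak in absolute terms: the sixth face above tops out at "Sad 71.2%", and the tenth splits almost evenly between Sad (26.7%), Calm (25.6%), and Fear (21.1%), so a confidence cutoff is advisable before trusting the winner.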

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 95.5%
Wheel 81.6%

Captions

Microsoft

a group of people standing in front of a building 93.1%
a group of people walking in front of a building 93%
a group of people that are standing in front of a building 89.6%

Text analysis

Amazon

ass
8.58.5E
YТ3А-X

Google

asA
YT37A2-
• • ... asA YT37A2- AGON
...
AGON