Human Generated Data

Title

Untitled (men in suits and hats seated outside circus tent)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8493

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.2
Person 99.2
Clothing 94.8
Apparel 94.8
Person 85.4
Outdoors 83.5
Tent 82.3
People 74.3
Leisure Activities 74
Camping 68.9
Shorts 64.8
Meal 64.7
Food 64.7
Nature 64.6
Person 62.3
Person 59.5
Suit 59.4
Coat 59.4
Overcoat 59.4
Plant 58.8
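
The Amazon tags above are label-detection output with confidence scores. Below is a minimal sketch of how comparable labels could be pulled for this photograph with the AWS SDK for Python (boto3); the file name, region, and thresholds are assumptions and are not part of the record.

import boto3

# Assumed local copy of the photograph; the record itself gives no file path.
IMAGE_PATH = "steinmetz_circus_tent.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the listed tags bottom out near 58-65
    )

# Print name and confidence, mirroring the "Person 99.6" style of the listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")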

Imagga
created on 2022-01-15

canvas tent 100
outdoor 16
sky 15.3
sunset 15.3
sun 13.7
tent 13.5
water 13.3
forest 13.1
outdoors 13
people 12.8
travel 12.7
landscape 12.6
tree 12.3
person 12.3
camping 11.8
grass 11.1
beach 11
summer 10.9
park 10.7
adult 10.4
dark 10
silhouette 9.9
animal 9.9
adventure 9.5
sport 9.2
camp 8.9
scenic 8.8
equipment 8.7
man 8.7
light 8.7
holiday 8.6
sea 8.6
clouds 8.4
sunrise 8.4
black 8.4
evening 8.4
field 8.4
leisure 8.3
vacation 8.2
dress 8.1
trees 8
horse 8
rural 7.9
tourism 7.4
freedom 7.3
activity 7.2
sports equipment 7.1
portrait 7.1
night 7.1
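
The Imagga tags are the output of its REST tagging service. A minimal sketch follows, assuming the v2 /tags endpoint with HTTP Basic authentication and an image hosted at a URL; the credentials, URL, and response shape shown here are assumptions based on Imagga's documented tagging API, not anything recorded above.

import requests

# Placeholder credentials and image URL; neither appears in the record.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/steinmetz_circus_tent.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

# Each tag carries a confidence and an English tag name, e.g. "canvas tent 100".
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")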

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

outdoor 99.6
clothing 93.7
person 91.5
text 90.5
man 89.1
standing 82.4
posing 76.6
player 74.3
old 72.1
black 69
tent 67.3

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 86.6%
Fear 89.4%
Calm 4.2%
Happy 3.9%
Surprised 0.9%
Confused 0.7%
Sad 0.5%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 27-37
Gender Male, 99.5%
Calm 94.3%
Surprised 4.6%
Disgusted 0.4%
Confused 0.3%
Happy 0.1%
Sad 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.3%
Happy 74.7%
Calm 10%
Surprised 8.4%
Confused 2.4%
Sad 2.1%
Angry 1.1%
Fear 0.9%
Disgusted 0.5%

AWS Rekognition

Age 35-43
Gender Male, 86.8%
Calm 83.8%
Angry 5.9%
Surprised 5.3%
Happy 1.6%
Sad 1.4%
Confused 1.1%
Disgusted 0.6%
Fear 0.3%

AWS Rekognition

Age 27-37
Gender Male, 97.6%
Calm 98.1%
Sad 0.5%
Fear 0.4%
Surprised 0.3%
Happy 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 18-26
Gender Male, 97.9%
Calm 82.5%
Fear 7.9%
Happy 2.8%
Sad 2.6%
Angry 1.8%
Disgusted 1.3%
Surprised 0.7%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
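
The face blocks above combine AWS Rekognition attributes (age range, gender, emotions) with Google Vision likelihood buckets. A minimal sketch of the Rekognition side is shown below, assuming a local image file; the field names follow the DetectFaces response, while the file name and region are assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_circus_tent.jpg", "rb") as f:  # assumed file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to match the highest-first listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")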

Feature analysis

Amazon

Person 99.6%
Tent 82.3%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 96.6%
a vintage photo of a group of people posing for a picture 96.5%
a vintage photo of a group of people standing in front of a building 95.7%
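
The Microsoft tags and captions correspond to Azure Computer Vision's image-description output. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders and do not come from the record.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute your own Azure resource values.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_azure_key"),
)

description = client.describe_image(
    "https://example.org/steinmetz_circus_tent.jpg",  # assumed hosted copy of the image
    max_candidates=3,
)

# Candidate captions with confidences, like the three listed above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")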

Text analysis

Amazon

16048
VAOOY
OPA
ИАМТ242
TAح

Google

16048. TGO 4 8.
16048.
TGO
4
8.
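
The strings above are OCR output over the image from Amazon Rekognition and Google Vision. A minimal sketch of the Amazon side using Rekognition's DetectText is shown below; the file name and region are again assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_circus_tent.jpg", "rb") as f:  # assumed file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole strings such as "16048"; WORD detections give fragments.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])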