Human Generated Data

Title

Untitled (people at a reception under a striped tent)

Date

1948

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8360

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.9
Person 98.9
Apparel 89
Clothing 89
Person 86.8
Crowd 83.3
Person 83.1
Person 81.3
Pedestrian 78.6
Person 75.4
Shorts 67.8
Person 66
Indoors 61.8
Path 61.4
Plant 61
Person 60.5
Female 60.5
Person 60.1
Floor 58.9
Face 58
Photo 57.4
Photography 57.4
Flooring 56.8
Tree 56.8
Advertisement 56.1
Person 52.5

Imagga
created on 2022-01-09

shop 50.8
mercantile establishment 32.9
toyshop 32.4
clothing 23.4
place of business 21.9
shoe shop 21.8
fashion 18.1
business 17.6
city 16.6
black 14.6
people 14.5
chandelier 13.8
boutique 12.5
man 12.3
style 11.9
art 11.8
person 11.7
model 11.7
urban 11.4
sexy 11.2
clothes 11.2
establishment 10.9
dress 10.8
lady 10.5
lighting fixture 10.3
covering 9.9
elegant 9.4
sale 9.2
travel 9.1
texture 9
retro 9
color 8.9
decoration 8.8
women 8.7
colorful 8.6
garment 8.5
design 8.4
portrait 8.4
case 8.3
body 8
market 8
indoors 7.9
wall 7.8
shoes 7.7
fixture 7.7
walking 7.6
pattern 7.5
human 7.5
world 7.4
street 7.4
shopping 7.3
group 7.2
adult 7.1
umbrella 7.1
hair 7.1
male 7.1
businessman 7.1
surface 7

Google
created on 2022-01-09

Black-and-white 84.9
Style 83.9
Monochrome photography 75
Art 73.2
Monochrome 73.1
Event 70.8
Font 65.6
Room 64.4
Crowd 63.7
Pole 62
Stock photography 61.9
Symmetry 60.2
Light fixture 59.6
Ceiling 57.3
Shorts 57.1
Pattern 55
Street 54.5
Metal 54.1
Visual arts 53.3
City 52.1

Microsoft
created on 2022-01-09

text 99.7
person 86.5
black and white 82
clothing 81.6
dance 81.5
group 60.6
people 57.7
posing 57.2
old 51.4
footwear 50.7
clothes 15.4

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 62.1%
Sad 70.4%
Calm 9%
Happy 7.1%
Fear 5.4%
Angry 3.4%
Surprised 2%
Disgusted 1.4%
Confused 1.2%

AWS Rekognition

Age 10-18
Gender Male, 85.5%
Calm 90.1%
Sad 5%
Happy 2.2%
Angry 0.7%
Fear 0.6%
Disgusted 0.5%
Confused 0.5%
Surprised 0.4%

AWS Rekognition

Age 23-31
Gender Male, 99.2%
Sad 39.2%
Calm 35%
Disgusted 7.9%
Happy 6.9%
Angry 6%
Surprised 2.8%
Fear 1.2%
Confused 1.1%

AWS Rekognition

Age 25-35
Gender Female, 64.8%
Calm 57.2%
Happy 12.6%
Angry 11.5%
Sad 9.5%
Fear 3.4%
Disgusted 2.8%
Surprised 1.6%
Confused 1.4%

AWS Rekognition

Age 35-43
Gender Male, 90.7%
Happy 88.1%
Calm 6.9%
Sad 3%
Fear 0.6%
Angry 0.4%
Disgusted 0.4%
Surprised 0.3%
Confused 0.2%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people standing in front of a crowd posing for the camera 90.1%
a group of people standing in front of a crowd 90%
a group of people posing for a photo 89.9%

Text analysis

Amazon

or
МАУТВАД
NAGON
YT37A2