Human Generated Data

Title

Untitled (women behind table with toys)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4511

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.8
Human 98.8
Person 97.7
Person 96.4
Person 94.6
Person 93.6
Person 92.4
Person 91
Person 90.9
Text 76.4
Photo Booth 71.3
Person 69.5
Room 68.6
Indoors 68.6
Dressing Room 68.2
Carousel 61.4
Amusement Park 61.4
Crowd 58.6
Lighting 57.5
Theme Park 56.7
Stage 55.4
People 55.4

Imagga
created on 2022-01-23

art 25.6
design 25.4
business 23.1
graphic 20.4
sign 19.6
tracing 19.2
silhouette 19
symbol 16.2
flag 15.9
grunge 15.3
finance 15.2
clip 14.9
party 13.8
retro 13.1
drawing 12.7
wavy 12.5
nation 12.3
fabric 12
rippling 11.8
national 11.8
waving 11.7
pattern 11.6
patriotism 11.6
patriotic 11.5
wallpaper 11.5
state 11.5
ripple 11.5
country 11.4
technology 11.1
money 11.1
world 10.9
black 10.8
people 10.6
success 10.5
decoration 10.4
investment 10.1
data 10
global 10
music 9.9
treasury 9.9
celebration 9.6
blackboard 9.5
flower 9.2
dance 9.2
element 9.1
financial 8.9
information 8.9
facility 8.8
man 8.8
disco 8.7
ornament 8.6
men 8.6
vintage 8.3
style 8.2
depository 8.2
paint 8.1
currency 8.1
stage 8.1
group 8.1
computer 8
idea 8
person 7.8
cloud 7.8
funky 7.7
dancing 7.7
modern 7.7
crowd 7.7
floral 7.7
international 7.6
creativity 7.4
banner 7.4
light 7.4
digital 7.3
card 7.3
night 7.1
architecture 7

Google
created on 2022-01-23

Dress 84.4
Table 84.4
Style 83.9
Black-and-white 82.9
Art 77
Tints and shades 76.6
Suit 74.2
Monochrome photography 73.9
Event 73.7
Font 71.9
Monochrome 69.3
Room 69.1
Tablecloth 67.6
Vintage clothing 67.6
Formal wear 66.9
History 65.8
Classic 65.3
Pattern 62.8
Canopy 56.8
Rectangle 55.8

Microsoft
created on 2022-01-23

text 99
person 82.3
clothing 80.2
old 46.2

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Female, 98.8%
Calm 85.1%
Sad 6.3%
Surprised 2.8%
Confused 2.4%
Happy 1.4%
Disgusted 0.7%
Fear 0.7%
Angry 0.6%

AWS Rekognition

Age 22-30
Gender Female, 86.1%
Sad 85.9%
Happy 6%
Calm 4.9%
Confused 1.5%
Surprised 0.5%
Disgusted 0.5%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Male, 99.7%
Calm 83.7%
Happy 7%
Sad 3.6%
Fear 1.7%
Angry 1.6%
Surprised 1.1%
Disgusted 0.8%
Confused 0.5%

AWS Rekognition

Age 40-48
Gender Male, 72.8%
Calm 61.8%
Sad 19.4%
Confused 8.6%
Angry 4.9%
Surprised 3.7%
Disgusted 0.9%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 21-29
Gender Female, 57.8%
Calm 86.2%
Sad 5.8%
Happy 2.9%
Angry 1.8%
Confused 1.7%
Disgusted 0.7%
Surprised 0.5%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people standing in front of a crowd 79.3%
a group of people standing in front of a crowd of people 76.8%
a group of people standing in a room 76.7%

Text analysis

Amazon

21428
21428.
ЧИАНТРАЗ

Google

21428 21426. a 1428.
a
21426.
1428.
21428