Human Generated Data

Title

Untitled (woman seated on carnival booth ledge, man standing beside her)

Date

1952

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4675

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Clothing 97.3
Apparel 97.3
Person 95.5
Text 92.8
Face 92.5
Pants 87.5
Chair 86
Furniture 86
Outdoors 85.2
Nature 83
Shorts 78.4
Female 78
Person 73.7
Word 73.5
Photography 69.6
Photo 69.6
Alphabet 68.2
Portrait 68
Kid 63.8
Child 63.8
Girl 62.9
Shoe 58.3
Footwear 58.3
Vehicle 57
Transportation 57
Advertisement 56.9
Water 56.5
Snow 56.3

Imagga
created on 2021-12-14

flag 83.5
emblem 67.8
sign 40.6
symbol 15.5
building 13.4
structure 13.3
travel 12.7
road 12.6
transportation 12.6
shop 12.1
information 11.5
sky 11.5
sale 11.1
business 10.9
billboard 10.8
design 10.7
traffic 10.5
architecture 10.3
art 10
blackboard 9.7
success 9.7
urban 9.6
street 9.2
city 9.1
old 9.1
graphic 8.8
marketing 8.6
finance 8.4
decoration 8
high 7.8
empty 7.7
wall 7.7
grunge 7.7
clip 7.4
vintage 7.4
exterior 7.4
transport 7.3
drawing 7.2
mercantile establishment 7.1
to 7.1
country 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.3
person 82.3
clothing 81.7
player 74.3
black and white 63.1

Face analysis

Amazon

Google

AWS Rekognition

Age 23-35
Gender Male, 73.2%
Calm 78.8%
Sad 11.4%
Happy 3.7%
Fear 2%
Confused 1.9%
Surprised 1.2%
Angry 0.8%
Disgusted 0.2%

AWS Rekognition

Age 15-27
Gender Female, 91.2%
Calm 71.4%
Surprised 12.2%
Happy 7.8%
Sad 2.6%
Fear 2.5%
Confused 1.9%
Angry 0.8%
Disgusted 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 58.3%

Captions

Microsoft

a person holding a sign 60.1%
a group of people posing for a photo 52.8%
a person standing next to a sign 52.7%

Text analysis

Amazon

FAMOUS
See
34569
MINUTE
CHILLS
HELL'S
DEMONS
S
ER FAMOUS
DEMONS MUR
ER
rformance
GHS
MUR
GILS
Conti
HELL'S 1/2ACRD
O MINUTE
THRILLA
1/2ACRD
O
24

Google

ER
Conti
See
Conti formance HELL'S A See OHINUT ER FAMOUS S DEMONS LHRILAG CHILLS HS
A
S
CHILLS
formance
HELL'S
OHINUT
DEMONS
FAMOUS
LHRILAG
HS