Human Generated Data

Title

Untitled (woman seated on carnival booth ledge, man standing beside her)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4678

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.9
Human 98.9
Person 98.2
Text 95.4
Outdoors 72
Symbol 70.9
Alphabet 70
Clothing 67.1
Apparel 67.1
Advertisement 66
Nature 65.4
Sign 63.8
Portrait 62.1
Face 62.1
Photo 62.1
Photography 62.1
People 60.2
Poster 55.9
Person 47.6

Imagga
created on 2021-12-14

sign 38.4
billboard 21.3
building 20.5
structure 20.1
flag 20
shop 17.6
business 17.6
signboard 16.5
architecture 15.8
emblem 14.7
transportation 14.3
office 14.1
city 14.1
road 13.6
traffic 13.3
ashcan 12.8
wall 12.1
construction 12
information 11.5
urban 11.4
design 11.3
mercantile establishment 11.1
exterior 11.1
sky 10.8
empty 10.3
bin 10.2
finance 10.1
container 10
entrance 9.7
success 9.7
technology 9.6
text 9.6
street 9.2
blackboard 9.1
old 9.1
signs 8.7
marketing 8.6
direction 8.6
bank 8.5
space 8.5
station 8.5
perspective 8.5
house 8.4
symbol 8.1
interior 8
high 7.8
glass 7.8
travel 7.7
modern 7.7
advertising 7.7
commercial 7.5
transport 7.3
place of business 7.1
hall 7.1
work 7.1
businessman 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.7
outdoor 89.4
person 76.1
clothing 76
black and white 70.3
billboard 59.3

Face analysis

Amazon

Google

AWS Rekognition

Age 19-31
Gender Female, 87.7%
Happy 84.1%
Calm 14.2%
Sad 1%
Confused 0.3%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 19-31
Gender Female, 69.7%
Sad 60%
Calm 32.7%
Fear 4.5%
Angry 0.9%
Happy 0.7%
Surprised 0.6%
Confused 0.5%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a person holding a sign 54.6%
a person standing next to a sign 54.5%
a group of people posing for a photo in front of a sign 50%

Text analysis

Amazon

UGHS
See
FAMOUS
ILLS
CHILLS
34571
S
MUR
M
rformance
ST
Contins
OFMONS
THRIL
HELL'S
VT33A2
VAGOM

Google

Contin
"
EMONS
SFC
formance
Seex
ו
CHILL
CHAMPS
HELLS
ACRE
FAMOUS
Contin formance HELLS ACRE Seex ו " FAMOUS EMONS THRIKS CHILL UGHS SFC CHAMPS
THRIKS
UGHS