Human Generated Data

Title

Untitled (two women at a carnival booth)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7735

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 97.6
Human 97.6
Person 94.3
Clothing 91
Apparel 91
Guitar 85.7
Musical Instrument 85.7
Leisure Activities 85.7
Performer 83.8
Musician 81.1
Face 78.2
Female 72.7
Portrait 69.8
Photography 69.8
Photo 69.8
Guitarist 65.2
Shorts 64.2
Text 62.7
Outdoors 61.7
Art 60.3
Girl 60
Drawing 59.3

Clarifai
created on 2023-10-25

people 99.8
adult 98.3
woman 97.4
group together 96.5
group 95.8
man 94.3
monochrome 92.8
administration 92.1
child 92
several 91.3
wear 91
recreation 90.1
three 89.7
two 88.1
leader 88
actress 85.5
facial expression 85.1
many 84.8
vehicle 83.7
four 81.8

Imagga
created on 2022-01-09

ship 17.9
vessel 17.8
pirate 17.7
architecture 17.5
building 17.4
people 16.7
travel 15.5
sky 14.7
adult 14.3
city 14.1
person 13.5
structure 13.3
flagpole 13.1
bridge 12.9
man 12.1
dress 11.7
device 11
staff 10.7
urban 10.5
support 10.3
construction 10.3
sea 10.2
equipment 10.1
column 10
statue 10
male 10
new 9.7
boat 9.7
high 9.5
leisure 9.1
business 9.1
stick 9.1
tourism 9.1
lifestyle 8.7
work 8.6
park 8.6
vacation 8.2
weapon 8.1
landmark 8.1
transportation 8.1
looking 8
clothing 7.8
standing 7.8
model 7.8
portrait 7.8
outside 7.7
old 7.7
outdoor 7.6
light 7.6
pole 7.6
fashion 7.5
craft 7.5
bow 7.5
water 7.3
industrial 7.3
cable 7.1
river 7.1
posing 7.1
night 7.1
vehicle 7.1
yacht 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.4
outdoor 90
person 87.5
playground 83.3
clothing 77.8
posing 75
black and white 72.3
ship 64.1

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 96.7%
Happy 65.2%
Calm 24.4%
Surprised 3.6%
Fear 3%
Sad 1.1%
Confused 1%
Disgusted 1%
Angry 0.8%

AWS Rekognition

Age 25-35
Gender Female, 92.8%
Calm 67.9%
Happy 27.1%
Sad 2.2%
Disgusted 0.8%
Confused 0.7%
Angry 0.6%
Surprised 0.5%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Guitar 85.7%

Text analysis

Amazon

Present
Past
FAMOUS
OSSIFIED
WORLD FAMOUS
WORLD
EWIS OSSIFIED MAN
EWIS
MAN
34557
OLLYWOOD
SHAMP
SIPA
from
WONDER APE S
HAGOR
ЗАЛЕ HAGOR
URNING & STONE
ЗАЛЕ

Google

RNING STONE Past Presat WORLD FAMOUS 34557
RNING
STONE
Past
Presat
WORLD
FAMOUS
34557