Human Generated Data

Title

Untitled (couple at carnival booth)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7737

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99
Human 99
Clothing 97.1
Apparel 97.1
Person 93.6
Furniture 92.8
Chair 92.8
Face 91.6
Outdoors 86.6
Text 83.2
Female 82.4
Nature 82.2
Shorts 80.8
Sport 76
Sports 76
Portrait 74.1
Photography 74.1
Photo 74.1
Word 72.9
Shoe 71.7
Footwear 71.7
People 66.3
Girl 65.8
Building 65.2
Athlete 59.5
Running 59.1
Crowd 57.8
Vehicle 56.6
Transportation 56.6
Play 56.5
Woman 55.8
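
The Amazon tags above are the kind of output returned by AWS Rekognition's label-detection endpoint: each label paired with a confidence score. The following is a minimal sketch of such a call using boto3; the filename, label cap, and confidence threshold are hypothetical illustrations, not details of the museum's actual pipeline.

# Sketch: fetch Rekognition labels for a local image file (hypothetical filename).
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_4.2002.7737.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,       # roughly the number of tags listed above
    MinConfidence=55,   # the lowest confidence shown above is about 55.8
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')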

Clarifai
created on 2023-10-26

people 99.8
monochrome 98.7
adult 98.3
man 97.3
group together 95
street 92.3
woman 87.7
vertical 84.9
wear 84.1
administration 81.7
child 80.8
athlete 80.2
uniform 77.8
text 77.5
sport 75.9
group 75.9
recreation 75.7
many 74.5
vehicle 74.3
actor 74

Imagga
created on 2022-01-09

flag 76.7
emblem 63.7
sign 27.8
billboard 16.2
road 12.6
structure 11.8
signboard 11.8
traffic 11.4
symbol 10.8
silhouette 10.8
man 10.1
transportation 9.9
night 9.8
street 9.2
people 8.9
signs 8.7
construction 8.6
black 8.4
city 8.3
danger 8.2
male 7.8
person 7.7
arrow 7.7
sky 7.7
building 7.6
outdoors 7.5
safety 7.4
light 7.4
shop 7.3
design 7.3
graphic 7.3

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.1
sign 79.3
person 73.1
black and white 72.3
clothing 64
baseball 50.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 80.5%
Calm 77.5%
Sad 16.2%
Happy 1.9%
Angry 1.3%
Confused 0.8%
Disgusted 0.8%
Fear 0.8%
Surprised 0.7%
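
The age range, gender estimate, and emotion scores above correspond to fields in AWS Rekognition's face-detection response. A hedged sketch of reading those fields with boto3 follows; the filename is hypothetical and this is not presented as the museum's actual workflow.

# Sketch: face attributes from AWS Rekognition (hypothetical input file).
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7737.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are returned unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')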

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
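
The Google Vision ratings above are likelihood enums (e.g. "Very unlikely") attached to each detected face. A minimal sketch using the google-cloud-vision client library, with a hypothetical filename and under the assumption that credentials are already configured:

# Sketch: face likelihoods from Google Cloud Vision (hypothetical input file).
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("steinmetz_4.2002.7737.jpg", "rb") as f:  # hypothetical filename
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood enum names (e.g. VERY_UNLIKELY) correspond to the ratings listed above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)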

Feature analysis

Amazon

Person 99%
Shoe 71.7%

Text analysis

Amazon

34570
See
CHILLS
S
LLS
formance
S LLS
Contin
F
ER F
ER
INUTE
N
DEMO
AHdo
THRILLE
HELL'S
GSHS
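
The fragments above are typical of OCR on partially legible signage; AWS Rekognition's text-detection call returns each detected line and word with a confidence value. A minimal sketch with boto3 (hypothetical filename, not the museum's actual pipeline):

# Sketch: detected text from AWS Rekognition (hypothetical input file).
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7737.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same content token by token
        print(detection["DetectedText"], round(detection["Confidence"], 1))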

Google

YT33A2 Contin Formance HELL'S ACRE Seey R F. SFC UHRIL CHILLS HS AMPS
YT33A2
Contin
Formance
HELL'S
ACRE
Seey
R
F.
SFC
UHRIL
CHILLS
HS
AMPS