Human Generated Data

Title

Pray - Sin, New York

Date

1954

People

Artist: William Klein, American (born 1928)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.599

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Person 99
Person 97.4
Person 97.3
Person 97
Person 96.8
Crowd 96.4
Accessories 96.1
Sunglasses 96.1
Accessory 96.1
Parade 95.9
Protest 95.5
Person 88.9
Machine 88
Wheel 88
Car 84.9
Transportation 84.9
Vehicle 84.9
Automobile 84.9
Person 81.6
Performer 78.3
Clothing 77.8
Shoe 77.8
Footwear 77.8
Apparel 77.8
Person 70.8
Suit 70
Overcoat 70
Coat 70
Pedestrian 69.6
Person 68.3
People 68.1
Shoe 66.9
Car 64.8

Imagga
created on 2022-01-08

city 32.4
people 25.1
man 24.2
street 23.9
urban 22.7
architecture 19.6
men 18.9
building 17.8
male 17.7
pedestrian 17.3
wind instrument 17.2
business 17
adult 16.2
photographer 16.1
travel 15.5
musical instrument 15.5
person 14.8
brass 14.5
megaphone 14.2
walking 13.3
women 11.9
black 11.8
acoustic device 11.7
clothing 11.2
day 11
tourism 10.7
world 10.7
device 10.4
town 10.2
tourist 10
suit 10
group 9.7
crowd 9.6
cornet 9.3
window 9.2
silhouette 9.1
transportation 9
scene 8.7
corporate 8.6
walk 8.6
old 8.4
fashion 8.3
holding 8.3
sidewalk 8
businessman 7.9
boy 7.8
standing 7.8
outdoor 7.6
two 7.6
buildings 7.6
life 7.5
uniform 7.5
bag 7.4
historic 7.3
landmark 7.2
looking 7.2
briefcase 7.1
to 7.1
work 7.1

Google
created on 2022-01-08

Tire 95.7
Wheel 93.1
Motor vehicle 90.2
Vehicle 89.9
Black 89.5
Car 89
Building 87.2
Street fashion 86.7
Black-and-white 85.8
Mode of transport 84.8
Style 83.9
Font 81.6
Road 76.5
Monochrome photography 75.1
Window 73.7
City 73.7
Crowd 73.4
Monochrome 72.9
Pedestrian 71.6
Event 71.5

Microsoft
created on 2022-01-08

text 99.8
street 95.7
clothing 93.4
black and white 91.2
outdoor 88.9
person 85.9
car 85.5
vehicle 83.4
people 78.5
land vehicle 73.8
man 65.2
group 58
monochrome 55.3

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 93.6%
Confused 4.9%
Angry 0.7%
Disgusted 0.3%
Sad 0.2%
Fear 0.1%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.1%
Calm 98.8%
Confused 0.5%
Happy 0.2%
Sad 0.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Sad 61%
Angry 11.3%
Confused 7.2%
Fear 6.5%
Surprised 5.8%
Disgusted 4.8%
Calm 2.5%
Happy 0.7%

AWS Rekognition

Age 36-44
Gender Male, 96.7%
Angry 40%
Happy 24.5%
Calm 24.5%
Sad 3.8%
Fear 3.7%
Confused 1.8%
Disgusted 1%
Surprised 0.7%

AWS Rekognition

Age 45-53
Gender Male, 99.4%
Sad 47.2%
Calm 24%
Fear 12.8%
Angry 6.9%
Confused 3.2%
Disgusted 2.4%
Surprised 1.9%
Happy 1.7%

AWS Rekognition

Age 21-29
Gender Male, 98.5%
Calm 86.9%
Angry 6.1%
Surprised 2.3%
Sad 2.1%
Fear 0.8%
Happy 0.7%
Disgusted 0.7%
Confused 0.3%

AWS Rekognition

Age 12-20
Gender Female, 97.2%
Calm 81.6%
Surprised 7%
Angry 3.9%
Sad 3.1%
Confused 1.2%
Happy 1.2%
Fear 1.2%
Disgusted 0.8%

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Sunglasses 96.1%
Wheel 88%
Car 84.9%
Shoe 77.8%
Suit 70%

Captions

Microsoft

a group of people standing in front of a store 92.1%
a group of people standing outside of a store 92%
a group of people standing in a store 90.5%

Text analysis

Amazon

THE
THAN
EVER
CLEAR
OF
KING
PLEASED
THIS
PRAY MORE THAN EVER
SIN
PRAY
PURE
AND
FOR
FATHER
NAME
TO
AMEN
CHRIST
WHAT
HEART
TO PRAY
MERCIFUL
TEACHING
broner
MORE
JESUS CHRIST AMEN
THOSE
KNOW
LIFE
ETERNAL LIFE
ETERNAL
YOUR
WILL
JESUS
THOU
DOM
AND A
DON'T
N
a
N G
BECAUSE
CREACHING
G
RIGHT
I AM CREACHING THE KING
WE
EST
THOU e PLEASED AND MERCIFUL
CATERING
DOM OF GOO AND TEACHING
HW
W save WE PURE HEART AND CLEAR
TO ASX GOD FOR
TIRED
CONSCIENCE
BEFORE AND GET YOUR HEART
THIS to TO YOU THAT DON'T KNOW HW
CONSCIENCE AND A CLEANGED MIND
TO ASX
CLEANGED
IN THE
BECAUSE "IS TIRED a
I AM
de WHAT
TO YOU
AIT THE
GOD
THIS 196 IN THE NAME OF OUR LORD
LOWE GOOD OUR HEAVENLY FATHER WILL
THE WADES OF - DEATH
DEATH
MIND
CARIST
WADES
GOO
SAVIDE
HEAVENLY
THAT
2
OUR
GET
kcal)
60015
SAVIDE 4505 CARIST
WITH THE ALMIGNTY GOO
to
LOWE
BEFORE
e
LORD
-
W
de
kcal) poole J
.KSFERN'S
WITH THE ALMIGNTY
save
GOOD
AIT
poole
J
Halbours
4505
"IS
CV.
196

Google

CONCN
DUR
M
PURE
MENT
MN
LRD
KSTERNS
WITH
SIN
tu
LOSO
AND
MERCIFA
NAME
WADE
all
p
MOE
THAN
ALM
SAVIORSS
THE
JESIIS
CHRIST
WH
e CONCN G0 DUR EVENY FER M THOUBE PLEASED AND MERCIFA VE WEA PURE MENT AN CNDENE ANDA CLEAN MN CATEDING ENDE NAME OF OA LRD ERG DRGT AMEN broner KSTERNS THE WADE is Mat e all p PRAY PEAY MOE THAN EVER REFORE AND 4T YOUR HART WITH THE ALM SAVIORSS CRKIS SIN LAM REACHING THE KN tu LOSO JESIIS CHRIST WH ALL CONIDENCEIN THE LORD DUYSTERNS
VE
WEA
AN
CATEDING
OF
OA
DRGT
Mat
e
PRAY
EVER
YOUR
LAM
DUYSTERNS
G0
EVENY
FER
PLEASED
CNDENE
ERG
is
REFORE
LORD
THOUBE
ANDA
CLEAN
ENDE
AMEN
broner
PEAY
4T
HART
CRKIS
REACHING
KN
ALL
CONIDENCEIN