Human Generated Data

Title

Untitled (group of debutantes)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19223

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.8
Human 98.8
Person 97.3
Person 97.1
Person 97
Person 96.6
Person 96.1
Person 96.1
Person 95
Person 94.7
Clothing 94.5
Apparel 94.5
Person 88.7
Person 87.5
Crowd 83.8
People 83.2
Person 82.8
Person 81.4
Costume 77.6
Person 75.5
Person 74.8
Person 74.4
Person 71
Person 66.4
Female 64.8
Face 61.5
Text 59
Portrait 58.5
Photography 58.5
Photo 58.5
Room 58
Indoors 58
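
The Amazon tags above follow the name/confidence format returned by the Rekognition DetectLabels API. A minimal sketch of how comparable output could be produced with boto3 follows; the image filename and the 50% confidence floor are assumptions for illustration, not part of this record.

import boto3

# Hypothetical local copy of the photograph; not part of this record.
IMAGE_PATH = "untitled_group_of_debutantes.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Request labels with at least 50% confidence (assumed threshold).
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,
)

# Print name/confidence pairs in the same shape as the tags listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")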

Clarifai
created on 2023-10-22

people 99.9
group 99
adult 97.8
group together 97.6
man 97.5
woman 97.1
many 95
wear 94.7
child 92.5
leader 90
administration 88.8
outfit 88.3
uniform 87
coat 86
veil 85.3
boy 84.5
military 84.3
adolescent 80.5
war 79.2
vehicle 78.4

Imagga
created on 2022-03-05

city 19.9
prison 19.9
building 17
people 16.7
world 16.6
correctional institution 16
kin 15.2
urban 14.8
old 14.6
man 14.1
adult 13.7
black 13.6
penal institution 12
street 12
window 10.6
walking 10.4
scene 10.4
business 10.3
antique 9.5
institution 9
outdoors 8.9
interior 8.8
passenger 8.7
sidewalk 8.7
architecture 8.6
men 8.6
statue 8.6
travel 8.4
stone 8.4
silhouette 8.3
dirty 8.1
history 8
portrait 7.8
person 7.7
winter 7.7
walk 7.6
child 7.6
spectator 7.5
traditional 7.5
vintage 7.4
life 7.4
male 7.3
shop 7.3
transportation 7.2
women 7.1
family 7.1
day 7.1

Google
created on 2022-03-05

Monochrome 70.1
Vintage clothing 68.2
Art 66.6
Monochrome photography 65.8
Team 65.7
Event 65.6
Room 65.3
History 65.3
Stock photography 63.6
Uniform 61
Crew 55.2
Child 54.7
Tree 53.6
Suit 53
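
The Google tags above resemble label annotations from the Cloud Vision API, which scores labels from 0 to 1. A sketch with the google-cloud-vision client, again assuming a hypothetical local image file:

from google.cloud import vision

# Hypothetical local copy of the photograph; not part of this record.
IMAGE_PATH = "untitled_group_of_debutantes.jpg"

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

# Label detection returns descriptions with scores in [0, 1];
# multiplying by 100 matches the percentages listed above.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")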

Microsoft
created on 2022-03-05

outdoor 95.4
text 92.8
person 92.5
clothing 90.9
posing 78.7
group 78.5
line 25.5

Color Analysis

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 97.9%
Happy 35.5%
Calm 33.2%
Sad 21.3%
Disgusted 3%
Confused 2.6%
Angry 2.2%
Fear 1.2%
Surprised 1%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Calm 54.3%
Happy 38.1%
Surprised 3.1%
Angry 1.4%
Disgusted 1%
Sad 0.9%
Confused 0.8%
Fear 0.5%

AWS Rekognition

Age 36-44
Gender Female, 83.9%
Calm 47.8%
Confused 17.1%
Happy 13.3%
Sad 8.6%
Disgusted 8.4%
Angry 2%
Surprised 1.9%
Fear 0.9%

AWS Rekognition

Age 36-44
Gender Female, 88.6%
Happy 42.2%
Calm 35.7%
Confused 14.7%
Sad 5.2%
Disgusted 0.8%
Surprised 0.8%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 64.7%
Calm 90.3%
Happy 8.8%
Confused 0.3%
Disgusted 0.2%
Sad 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 94.5%
Happy 84.8%
Calm 14.1%
Sad 0.6%
Surprised 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 47-53
Gender Male, 97.1%
Happy 93.9%
Calm 4.3%
Surprised 0.5%
Sad 0.4%
Fear 0.3%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%
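
Each AWS Rekognition block above (age range, gender, emotion percentages) matches the FaceDetails structure returned by the DetectFaces API. A minimal sketch with boto3, assuming a hypothetical local image file:

import boto3

# Hypothetical local copy of the photograph; not part of this record.
IMAGE_PATH = "untitled_group_of_debutantes.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Attributes=['ALL'] adds AgeRange, Gender, and Emotions to each face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")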

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
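
The Google Vision block above reports likelihood buckets (Very unlikely through Very likely) rather than percentages; these correspond to the likelihood enums on Cloud Vision face annotations. A sketch, again assuming a hypothetical local image file:

from google.cloud import vision

# Hypothetical local copy of the photograph; not part of this record.
IMAGE_PATH = "untitled_group_of_debutantes.jpg"

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (e.g. VERY_UNLIKELY, LIKELY).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)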

Feature analysis

Amazon

Person
Person 98.8%
Person 97.3%
Person 97.1%
Person 97%
Person 96.6%
Person 96.1%
Person 96.1%
Person 95%
Person 94.7%
Person 88.7%
Person 87.5%
Person 82.8%
Person 81.4%
Person 75.5%
Person 74.8%
Person 74.4%
Person 71%
Person 66.4%

Categories

Text analysis

Amazon

D
ST
5
3
t
t.
و
200 t 3 Y . t >
.
.TH
XAOOX
m
>
& m . 3 t. 207
207
200
Y
XAGOX
&
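
The Amazon text detections above, apparently fragments of film edge markings, are the kind of output produced by the Rekognition DetectText API. A sketch with boto3, assuming a hypothetical local image file:

import boto3

# Hypothetical local copy of the photograph; not part of this record.
IMAGE_PATH = "untitled_group_of_debutantes.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both WORD and LINE detections; printing both
# mirrors the mix of short fragments and longer strings listed above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")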

Google

KODYK KODVK 2.TEEAAtirn
KODYK
KODVK
2.TEEAAtirn