Human Generated Data

Title

Untitled (group of debutantes)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19247

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 98.5
Apparel 98.5
Person 98.2
Human 98.2
Person 98.1
Person 97.9
Person 97.5
Person 96.3
Person 95.8
Person 95.7
Person 92.9
Person 92.9
Person 91.3
Face 89.6
Person 89.1
Person 88.1
Person 87.4
Dress 87.2
Person 85.5
Person 85.5
Person 82.4
Female 81.3
People 81.1
Person 80.6
Crowd 77.1
Person 75.4
Girl 70.8
Kid 66.9
Child 66.9
Staircase 63
Portrait 62.5
Photography 62.5
Photo 62.5
Person 61.2
Costume 60.7
Shorts 59.8
Suit 59.7
Coat 59.7
Overcoat 59.7
Woman 57.5
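
The tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation, each with a confidence score. A minimal sketch of such a call, assuming configured AWS credentials; the filename is hypothetical and not part of this record:

```python
# Minimal sketch: image labels from Amazon Rekognition DetectLabels.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("burian_debutantes.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the weakest tag above ("Woman") scores 57.5
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```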

Clarifai
created on 2023-10-22

people 99.9
group 99.3
group together 98.8
adult 98.5
many 98.3
man 98.3
woman 97.6
child 95.7
military 91.5
wear 90.8
leader 90.1
vehicle 89.8
administration 89.6
recreation 89.5
uniform 88.8
war 87.1
outfit 86.9
transportation system 85.6
boy 84.3
crowd 82.8
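
The Clarifai tags follow the same pattern but come from a different model. A sketch against Clarifai's v2 predict REST endpoint using requests; the API key and image URL are placeholders, and the general-recognition model name is an assumption, since the record does not say which model was used:

```python
# Sketch of a Clarifai v2 predict call; the key, model id, and image
# URL are placeholders, not values from this record.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts carry 0-1 confidence values; scale to match the listing.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```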

Imagga
created on 2022-03-05

people 24.5
man 18.2
person 17.1
city 13.3
world 13.1
photographer 13.1
male 13
sport 12.5
adult 12.4
brass 11.1
business 10.9
silhouette 10.7
group 10.5
black 9.9
travel 9.8
family 9.8
old 9.7
military 9.6
clothing 9.6
kin 9.3
bride 8.7
love 8.7
happiness 8.6
men 8.6
statue 8.5
child 8.4
portrait 8.4
outdoor 8.4
trombone 8.4
weapon 8.2
protection 8.2
new 8.1
history 8
life 8
interior 8
day 7.8
couple 7.8
uniform 7.7
war 7.6
dark 7.5
human 7.5
wind instrument 7.4
wedding 7.4
room 7.3
musical instrument 7.3
holiday 7.2
building 7.1
summer 7.1
architecture 7
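
Imagga exposes a comparable tagging endpoint behind HTTP basic authentication. A hedged sketch of its /v2/tags API; the credentials and image URL are placeholders:

```python
# Sketch of an Imagga /v2/tags request; credentials and image URL
# are placeholders, not values from this record.
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```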

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.5
person 95.9
clothing 94.8
posing 94.5
group 79.2
man 69.9
clothes 23.7
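
The Microsoft tags correspond to Azure Computer Vision's image-analysis endpoint with the Tags feature. A sketch over the v3.2 REST API; the resource endpoint, key, and image URL are placeholders:

```python
# Sketch of an Azure Computer Vision analyze call for tags; endpoint,
# key, and image URL are placeholders, not values from this record.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

# Confidences are 0-1; scale to match the listing.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```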

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.5%
Calm 83.1%
Sad 11.6%
Happy 2.4%
Surprised 1.1%
Confused 0.9%
Fear 0.5%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 30-40
Gender Male, 84.1%
Calm 46.9%
Happy 46.4%
Surprised 3%
Sad 1.3%
Confused 1.2%
Fear 0.5%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 45-53
Gender Male, 83.9%
Calm 99.3%
Happy 0.4%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%
Sad 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 24-34
Gender Male, 98.5%
Sad 59.5%
Calm 11.8%
Happy 9%
Confused 7.8%
Disgusted 4.6%
Angry 2.7%
Fear 2.5%
Surprised 2.1%

AWS Rekognition

Age 34-42
Gender Female, 81.5%
Sad 73.4%
Calm 21.5%
Confused 2.3%
Happy 0.9%
Disgusted 0.9%
Surprised 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 27-37
Gender Male, 95.2%
Calm 92.1%
Happy 6.1%
Surprised 0.4%
Disgusted 0.4%
Sad 0.4%
Confused 0.3%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Male, 88.9%
Sad 40.7%
Confused 22%
Calm 16.9%
Happy 16.8%
Disgusted 1.8%
Surprised 1.1%
Angry 0.5%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Female, 66.5%
Calm 85.2%
Happy 13.5%
Confused 0.5%
Sad 0.4%
Disgusted 0.2%
Surprised 0.1%
Fear 0%
Angry 0%

AWS Rekognition

Age 45-51
Gender Female, 71.7%
Calm 99.1%
Happy 0.2%
Sad 0.2%
Surprised 0.1%
Fear 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0%
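
Each block above corresponds to one face returned by Amazon Rekognition's DetectFaces operation with full attributes. A minimal sketch that prints the same fields (age range, gender, emotions), assuming configured AWS credentials; the filename is hypothetical:

```python
# Sketch: per-face attributes from Amazon Rekognition DetectFaces.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("burian_debutantes.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort by confidence to match the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```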

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
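
The Google Vision blocks report bucketed likelihoods rather than percentages, which matches the face_detection method of the Cloud Vision API. A minimal sketch, assuming GOOGLE_APPLICATION_CREDENTIALS is set; the filename is hypothetical:

```python
# Sketch: face likelihood buckets from the Google Cloud Vision API.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("burian_debutantes.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enums: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, ...
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```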

Feature analysis

Amazon

Person
Person 98.2%
Person 98.1%
Person 97.9%
Person 97.5%
Person 96.3%
Person 95.8%
Person 95.7%
Person 92.9%
Person 92.9%
Person 91.3%
Person 89.1%
Person 88.1%
Person 87.4%
Person 85.5%
Person 85.5%
Person 82.4%
Person 80.6%
Person 75.4%
Person 61.2%
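
The per-person percentages repeat the Person label's instance confidences: DetectLabels returns an Instances list for countable labels, each instance with its own bounding box and score. A self-contained sketch of extracting them; the filename is hypothetical:

```python
# Sketch: per-instance "Person" confidences from DetectLabels.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("burian_debutantes.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]  # coordinates relative to image size
            print(f'Person {instance["Confidence"]:.1f}% '
                  f'(left={box["Left"]:.2f}, top={box["Top"]:.2f})')
```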

Categories

Text analysis

Amazon

2
E
8
MAGOM
YT37A2
MJI7 YT37A2
MJI7
MAGOX
MJ17 YT37A*
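
The fragments above appear to be the film's edge markings read in mirror (e.g. "YT37A2" resembling a reversed "SAFETY"), as picked up by Amazon Rekognition's DetectText operation. A minimal sketch, assuming configured AWS credentials; the filename is hypothetical:

```python
# Sketch: detected text lines from Amazon Rekognition DetectText.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("burian_debutantes.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word entries
        print(detection["DetectedText"])
```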

Google

MJI7 Y T37 A2 MAGOX MJIR Y T 33 A°2 MAGOX
MJI7
Y
T37
A2
MAGOX
MJIR
T
33
A°2
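
The Google results pair one concatenated line with individual tokens, which matches the text_detection response shape: the first annotation is the full detected block, followed by per-word entries. A minimal sketch, assuming GOOGLE_APPLICATION_CREDENTIALS is set; the filename is hypothetical:

```python
# Sketch: OCR output from the Google Cloud Vision text_detection method.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("burian_debutantes.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```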