Human Generated Data

Title

Untitled (group of debutantes)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19189

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.6
Human 98.6
Clothing 98.6
Apparel 98.6
Person 98.2
Person 97.4
Person 95.8
Person 95.1
Person 94.1
Person 92
Person 91.6
Person 89.3
Person 87.3
Person 85.4
Person 80.6
Shoe 79
Footwear 79
Overcoat 74.5
Coat 74.5
Fashion 69.7
Robe 65.1
Person 61.3
Female 59.8
Flooring 59.7
Evening Dress 57.5
Gown 57.5
Cloak 56.8
Photography 55.5
Photo 55.5
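
These name/confidence pairs are typical of Amazon Rekognition's DetectLabels output. Below is a minimal sketch of how such a tag list can be reproduced with boto3; the file name and the MinConfidence cutoff are assumptions, not part of the original record:

```python
import boto3

client = boto3.client("rekognition")

with open("debutantes.jpg", "rb") as f:  # assumed file name
    image_bytes = f.read()

# DetectLabels returns Name/Confidence pairs like the list above.
# MinConfidence=55 is an assumed cutoff, chosen only because the
# lowest score shown here is 55.5.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```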

Clarifai
created on 2023-10-22

people 99.9
group 98
woman 97.6
indoors 97.3
monochrome 97.3
man 97.3
adult 95.3
child 94.8
many 94.7
wedding 94.5
group together 94.2
wear 91.5
room 89.7
chair 89.5
leader 89.5
education 88.9
school 88.7
music 87.6
administration 85.6
portrait 84.8
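
Clarifai returns concepts of this kind as name/value pairs from its prediction endpoint. A hedged sketch against its REST API follows; the API key placeholder, the image URL, and the general-image-recognition model ID are assumptions, and the exact endpoint shape may differ by account setup:

```python
import requests

headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}  # placeholder key
payload = {"inputs": [{"data": {"image": {
    "url": "https://example.com/debutantes.jpg"}}}]}      # placeholder URL

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers, json=payload,
)
# Concept values are 0-1 floats; the list above shows them as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```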

Imagga
created on 2022-03-05

business 35.8
building 29.9
people 29.6
passenger 29.2
office 25.8
corporate 24.9
urban 23.6
man 22.8
city 22.4
men 21.5
adult 21
window 19.7
architecture 19.6
travel 19
businessman 18.5
group 17.7
gate 17.6
businesswoman 16.4
clothing 16.3
robe 15.9
women 15.8
black 15.7
male 15.7
prison 15.6
modern 15.4
garment 14.8
hall 14.1
silhouette 13.2
meeting 13.2
happy 13.2
correctional institution 12.6
work 12.6
transportation 12.5
team 12.5
professional 12.3
walking 12.3
success 12.1
person 12
airport 11.7
crowd 11.5
interior 11.5
indoors 11.4
wall 11.1
executive 11.1
corridor 10.8
entrance 10.6
life 10.6
door 10.5
scene 10.4
journey 10.4
manager 10.2
day 10.2
teamwork 10.2
suit 10.1
indoor 10
worker 9.9
departure 9.8
job 9.7
working 9.7
walk 9.5
businesspeople 9.5
penal institution 9.4
street 9.2
pretty 9.1
attractive 9.1
tourism 9.1
chair 8.9
sitting 8.6
metropolitan 8.4
tourist 8.3
fashion 8.3
inside 8.3
outdoors 8.2
subway 7.9
handshake 7.8
mall 7.8
glass 7.8
construction 7.7
old 7.7
stone 7.6
adults 7.6
career 7.6
communication 7.6
successful 7.3
time 7.3
reflection 7.3
transport 7.3
smiling 7.2
lifestyle 7.2
shop 7.1
portrait 7.1
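
Imagga's tagging endpoint produces the same kind of confidence-ranked list. A minimal sketch against its v2 REST API, with placeholder credentials and image URL:

```python
import requests

# Placeholder Imagga credentials and source image.
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
params = {"image_url": "https://example.com/debutantes.jpg"}

resp = requests.get("https://api.imagga.com/v2/tags", auth=auth, params=params)
# Each entry pairs an English tag with a confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```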

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.6
clothing 91.3
black and white 90.1
person 88.4
line 21
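
The Microsoft tags match the shape of Azure Computer Vision's image-analysis response. A sketch against the v3.2 REST analyze endpoint; the resource endpoint, subscription key, and image URL are placeholders:

```python
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}       # placeholder
params = {"visualFeatures": "Tags"}
body = {"url": "https://example.com/debutantes.jpg"}            # placeholder

resp = requests.post(f"{endpoint}/vision/v3.2/analyze",
                     headers=headers, params=params, json=body)
# Confidence comes back as a 0-1 float; shown above as a percentage.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```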

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 91.3%
Calm 64.2%
Sad 22.9%
Happy 8.3%
Fear 1.6%
Disgusted 1.1%
Surprised 1%
Confused 0.6%
Angry 0.3%

AWS Rekognition

Age 37-45
Gender Male, 95.9%
Calm 53.1%
Happy 37.3%
Sad 5.4%
Surprised 1.8%
Confused 0.9%
Angry 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 21-29
Gender Female, 51.2%
Calm 90.4%
Fear 2.9%
Sad 2.4%
Confused 1.5%
Happy 1.1%
Disgusted 0.7%
Angry 0.5%
Surprised 0.4%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Sad 95.2%
Calm 1.7%
Happy 1.2%
Confused 1.1%
Disgusted 0.5%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Female, 97.3%
Calm 50%
Happy 45.9%
Surprised 1.5%
Sad 1.5%
Confused 0.4%
Fear 0.4%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Male, 89.9%
Calm 96.1%
Happy 1.6%
Sad 1.2%
Fear 0.5%
Confused 0.4%
Disgusted 0.2%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 35-43
Gender Male, 77.2%
Calm 98.9%
Happy 0.4%
Disgusted 0.2%
Confused 0.2%
Sad 0.2%
Fear 0.1%
Surprised 0.1%
Angry 0%

AWS Rekognition

Age 27-37
Gender Male, 99.2%
Confused 41.6%
Calm 20.6%
Happy 8.9%
Fear 7.5%
Surprised 7.2%
Sad 6.8%
Disgusted 4.6%
Angry 2.8%
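
Each block above is one entry from Rekognition's DetectFaces response: an estimated age range, a gender guess with its confidence, and a full emotion distribution. A minimal sketch of reproducing that output with boto3 (the file name is assumed):

```python
import boto3

client = boto3.client("rekognition")

with open("debutantes.jpg", "rb") as f:  # assumed file name
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```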

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
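
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why every field above reads "Very unlikely". A sketch using the google-cloud-vision client (the file name is assumed):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("debutantes.jpg", "rb") as f:  # assumed file name
    image = vision.Image(content=f.read())

# Each detected face carries Likelihood enums (VERY_UNLIKELY ... VERY_LIKELY)
# rather than numeric confidences.
response = client.face_detection(image=image)
for face in response.face_annotations:
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{field}_likelihood")
        print(f"{field.title()} {likelihood.name.replace('_', ' ').capitalize()}")
```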

Feature analysis

Amazon

Person 98.6%
Person 98.2%
Person 97.4%
Person 95.8%
Person 95.1%
Person 94.1%
Person 92%
Person 91.6%
Person 89.3%
Person 87.3%
Person 85.4%
Person 80.6%
Person 61.3%
Shoe 79%

Text analysis

Amazon

11
G 10
2.
MAGOM
٤١٢٣
YT37A2 XOOKS ٤١٢٣
YT37A2
XOOKS
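
These strings are Rekognition DetectText output; the seemingly garbled tokens are most likely the mirrored "KODAK SAFETY" film-edge markings the OCR picked up, not corruption in the record. A sketch of the call (the file name is assumed):

```python
import boto3

client = boto3.client("rekognition")

with open("debutantes.jpg", "rb") as f:  # assumed file name
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections, which is why the
# list above repeats tokens (a full line, then its individual words).
response = client.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```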

Google

KODYK 2.v E EIA LITN KODYK 2. G 10 11
KODYK
2.v
E
EIA
LITN
2.
G
10
11
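
Google's OCR output follows the same pattern: the first annotation is the full detected string ("KODYK 2.v E EIA LITN ..."), followed by one annotation per token. A sketch with the google-cloud-vision client (the file name is assumed):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("debutantes.jpg", "rb") as f:  # assumed file name
    image = vision.Image(content=f.read())

# text_annotations[0] holds the whole detected string; the remaining
# entries are the individual tokens listed above.
response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```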