Human Generated Data

Title

Untitled (girls from dance class posing in a line)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18402

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Person 99.6
Person 99.2
Person 98.6
Person 98.2
Footwear 97.3
Clothing 97.3
Shoe 97.3
Apparel 97.3
Person 96.6
Shoe 95
Person 95
Person 94.7
Person 93.4
Sport 92.9
Sports 92.9
Leisure Activities 84.9
Dance Pose 84.9
Shoe 84.9
Skating 83.3
Person 70.2
Ice Skating 67
Female 60.9
Person 58.6
Person 56
Face 56
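
The labels above (the numbers are confidence values in percent) are the kind of output produced by Amazon Rekognition's label-detection API. A minimal sketch of how such tags could be reproduced with boto3 follows; the file name and AWS credentials are placeholders, and this is not necessarily the exact pipeline used to generate the data shown here.

    import boto3

    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder
    # for a local copy of the digitized photograph.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

    # Each label carries a name and a confidence in percent,
    # matching entries such as "Person 99.6" and "Footwear 97.3" above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))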

Imagga
created on 2022-03-04

room 25.6
group 25
classroom 22
people 20.6
table 19.1
chair 18.5
team 17
brass 16.4
crowd 16.3
man 16.1
person 16.1
indoors 15.8
bugle 14.9
business 14
wind instrument 13.4
interior 13.3
equipment 13.1
teacher 13
row 12.1
men 12
professional 11.2
support 10.8
steel 10.6
adult 10.5
work 10.2
teamwork 10.2
training 10.2
lifestyle 10.1
dinner 10.1
male 9.9
coat hanger 9.9
hanger 9.9
metal 9.7
musical instrument 9.6
seat 9.6
restaurant 9.5
party 9.5
modern 9.1
shoe shop 9.1
health 9
glass 9
businessman 8.8
shop 8.8
women 8.7
dancer 8.7
gym 8.6
dining 8.6
3d 8.5
casual 8.5
device 8.4
wood 8.3
event 8.3
educator 8.3
clothing 8.2
hall 8.1
weight 8.1
education 7.8
building 7.6
gymnasium 7.5
silhouette 7.4
floor 7.4
service 7.4
indoor 7.3
furniture 7.2
body 7.2
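
Imagga exposes a REST tagging endpoint that returns the same kind of tag/confidence pairs listed above. A rough sketch using the requests library is shown below; the API key and secret are placeholders, and the endpoint and response fields are assumptions based on Imagga's public v2 API.

    import requests

    # Placeholder credentials; the v2 tagging endpoint and response shape
    # are assumptions based on Imagga's public documentation.
    API_KEY, API_SECRET = "your_key", "your_secret"
    with open("photo.jpg", "rb") as f:
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Tags arrive with confidences on a 0-100 scale, as in "room 25.6" above.
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))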

Google
created on 2022-03-04

Black 89.6
Gesture 85.3
Font 81.3
Team 73
Crew 72.4
Event 70.9
Monochrome photography 67.3
Metal 67
Monochrome 66.2
Room 64
Stock photography 62.2
Rectangle 56.8
Happy 54.5
Knee 50.9
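
The Google tags above correspond to label detection in the Cloud Vision API, whose scores are reported on a 0-1 scale and rendered here as percentages. A minimal sketch with the google-cloud-vision client library (file name and project credentials are placeholders):

    from google.cloud import vision

    # Assumes Google Cloud credentials are configured in the environment.
    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)

    # Scores are 0-1; multiplied by 100 they match values like "Black 89.6" above.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))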

Microsoft
created on 2022-03-04

person 97.5
footwear 97
wall 96.7
clothing 96.6
text 95.2
dance 94.6
woman 68.7
line 30.5
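
The Microsoft tags are the kind of output returned by Azure Computer Vision's image-tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )
    with open("photo.jpg", "rb") as f:
        analysis = client.tag_image_in_stream(f)

    # Confidences are 0-1; scaled to percent they match "person 97.5" above.
    for tag in analysis.tags:
        print(tag.name, round(tag.confidence * 100, 1))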

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Sad 45.4%
Surprised 23.1%
Calm 11.2%
Happy 7.7%
Fear 4.6%
Confused 3.6%
Angry 2.8%
Disgusted 1.6%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 59.5%
Sad 17.8%
Happy 11%
Angry 2.9%
Confused 2.9%
Surprised 2.7%
Disgusted 1.8%
Fear 1.3%

AWS Rekognition

Age 37-45
Gender Male, 100%
Calm 64.1%
Surprised 31.3%
Sad 1.2%
Angry 1.1%
Happy 0.7%
Fear 0.6%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 41-49
Gender Female, 99.9%
Fear 46.9%
Happy 37.2%
Surprised 9.8%
Calm 3.3%
Sad 1.1%
Angry 0.9%
Disgusted 0.6%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 95.7%
Sad 2.9%
Angry 0.6%
Surprised 0.3%
Disgusted 0.2%
Confused 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99.6%
Calm 67.8%
Surprised 12.6%
Happy 11.9%
Fear 3.2%
Angry 2.8%
Sad 0.8%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 31-41
Gender Male, 97.1%
Sad 31.8%
Calm 29%
Happy 10.6%
Angry 9.5%
Surprised 7.7%
Disgusted 4.7%
Fear 4.2%
Confused 2.5%
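
Each block above is one face returned by Amazon Rekognition's face-detection API, with an estimated age range, a gender guess, and a confidence score for every emotion. A sketch of how such per-face summaries could be produced with boto3 (placeholder file name, AWS credentials assumed):

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    # One line per detected face: age range, gender, and the
    # highest-confidence emotion, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f"Age {age['Low']}-{age['High']}, "
              f"{gender['Value']} {gender['Confidence']:.1f}%, "
              f"{top['Type'].capitalize()} {top['Confidence']:.1f}%")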

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
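
The Google Vision entries report face attributes as likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision client (placeholder file name, credentials assumed):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum such as VERY_UNLIKELY,
    # which the listing above renders as "Very unlikely".
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name,
              "| Anger", vision.Likelihood(face.anger_likelihood).name,
              "| Sorrow", vision.Likelihood(face.sorrow_likelihood).name,
              "| Joy", vision.Likelihood(face.joy_likelihood).name,
              "| Headwear", vision.Likelihood(face.headwear_likelihood).name,
              "| Blurred", vision.Likelihood(face.blurred_likelihood).name)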

Feature analysis

Amazon

Person 99.6%
Shoe 97.3%

Captions

Microsoft

a group of people posing for a photo 64.5%
a group of people posing for the camera 64.4%
a group of people posing for a picture 64.3%
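
The three candidate captions are characteristic of Azure Computer Vision's describe-image operation, which returns ranked captions with confidences. A self-contained sketch with the same SDK as the tagging example (endpoint, key, and file name are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key, as in the tagging sketch above.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )
    with open("photo.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Each caption candidate carries a confidence on a 0-1 scale.
    for caption in description.captions:
        print(caption.text, f"{caption.confidence * 100:.1f}%")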

Text analysis

Amazon

FUNERAL
STORE
PYROFAX
FUNERAL DIRECTO
INSURANCE
DIRECTO
FURNITURE STORE
INSURANCE -
COOKS
FURNITURE
-
PYROFAX GMMAN Trit
Trit
IAD
FORES
GMMAN
FORES PILLSBURYHertracht Deal
DE
Deal
رحده
REFRIGERATURE
PILLSBURYHertracht
Partic
DES Partic
DES
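
The fragments above, including the garbled strings, are raw output of Amazon Rekognition's text-detection API run against the storefront signage visible in the photograph. A sketch with boto3 (placeholder file name, credentials assumed):

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE-level detections roughly correspond to the entries above;
    # WORD-level detections are also returned but skipped here.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))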

Google

DIRECT
FOR HOES PILLSBUertranda ONTURE PYROFAX ERS TUNERAL DIRECT
PILLSBUertranda
TUNERAL
FOR
HOES
ONTURE
PYROFAX
ERS
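
The Google entries are the equivalent output of Cloud Vision text detection, where the first annotation holds the full concatenated text and the rest are individual words. A sketch (placeholder file name, credentials assumed):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)

    # text_annotations[0].description is the full detected text block;
    # the remaining annotations are the individual words listed above.
    if response.text_annotations:
        print(response.text_annotations[0].description)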