Human Generated Data

Title

Untitled (group of women at table, social club)

Date

1953

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18193

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 99.5
Person 99.5
Person 99.4
Person 99.3
Person 99
Person 98.9
Person 98.7
Person 97.5
Home Decor 96
Restaurant 94
Furniture 93.8
Chair 93.2
Person 93.2
Overcoat 92.1
Clothing 92.1
Apparel 92.1
Suit 92.1
Coat 92.1
Food 91.9
Meal 91.9
Table 91.8
Room 89.7
Indoors 89.7
Dining Room 89.7
Sitting 87.7
Dining Table 85.6
Crowd 85.5
Tablecloth 83.1
Tabletop 82.3
Dish 79.8
Person 78.8
People 77.8
Face 76.6
Beverage 70.7
Drink 70.7
Portrait 70.3
Photography 70.3
Photo 70.3
Linen 68.7
Bar Counter 65.9
Pub 65.9
Dinner 61.1
Supper 61.1
Sunglasses 60.2
Accessories 60.2
Accessory 60.2
Party 58.1
Tuxedo 57.7
Alcohol 57.3
Musician 56.9
Musical Instrument 56.9
Diner 56.3
Cafeteria 55.2
Glass 55
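The Amazon tag list above matches the response shape of AWS Rekognition's `detect_labels` operation (a list of label names with confidence scores, here created on 2022-03-04). A minimal sketch of how such a listing could be rendered, assuming the documented boto3 response format; the API call itself is commented out, and `sample_response` reuses a few of the confidences shown above for illustration:

```python
# Hypothetical sketch: the actual pipeline that produced the tags is not
# documented here; this only mirrors the public detect_labels response shape.

# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:
#     response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Chair", "Confidence": 93.2},
        {"Name": "Glass", "Confidence": 55.0},
    ]
}

def format_labels(response):
    """Render labels as 'Name Confidence' lines, highest confidence first."""
    labels = sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:g}" for l in labels]

for line in format_labels(sample_response):
    print(line)  # e.g. "Person 99.5"
```

The `:g` format drops trailing zeros, which is why whole-number confidences appear without decimals (e.g. "Glass 55") in listings like the one above.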

Imagga
created on 2022-03-04

man 31.5
person 31.2
people 30.6
wheelchair 28
room 25.2
classroom 25.1
chair 24.8
nurse 23.1
male 22
men 21.4
adult 20.1
patient 20.1
business 16.4
sitting 16.3
indoors 15.8
seat 15.7
teacher 15.7
shop 14.7
barbershop 14.6
home 14.3
women 14.2
couple 13.9
smiling 13.7
case 13.7
lifestyle 13.7
group 13.7
hand 12.9
professional 12.8
happy 12.5
businessman 12.3
senior 12.2
black 12
sick person 11.7
team 11.6
child 11.5
salon 11.1
furniture 10.6
together 10.5
mercantile establishment 10.3
communication 10.1
to 9.7
interior 9.7
boy 9.6
love 9.5
meeting 9.4
inside 9.2
indoor 9.1
modern 9.1
cheerful 8.9
job 8.8
medical 8.8
30s 8.6
educator 8.1
office 8.1
work 7.8
happiness 7.8
two people 7.8
travel 7.7
two 7.6
student 7.6
mature 7.4
life 7.4
teamwork 7.4
girls 7.3
looking 7.2
place of business 7.1
blackboard 7.1
family 7.1
working 7.1

Google
created on 2022-03-04

Outerwear 95.3
Black 89.5
Coat 88.4
Table 87.3
Black-and-white 84.3
Style 83.9
Adaptation 79.3
Chair 78.1
Monochrome 77.1
Monochrome photography 74.9
Suit 74.3
Vintage clothing 74.2
Event 73.5
Classic 72.9
Room 72.4
Tablecloth 67.8
Art 65.6
History 65.3
Stock photography 62.5
Coffee table 60.9

Microsoft
created on 2022-03-04

person 99.1
text 94.9
clothing 94.4
group 91
man 84.5
people 79.8
posing 43.6

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 98%
Surprised 84.5%
Calm 11.1%
Sad 2.1%
Disgusted 0.7%
Fear 0.6%
Confused 0.4%
Happy 0.2%
Angry 0.2%

AWS Rekognition

Age 36-44
Gender Male, 99%
Happy 53.2%
Calm 35.5%
Sad 3.7%
Confused 2.9%
Surprised 2.7%
Disgusted 1%
Fear 0.6%
Angry 0.4%

AWS Rekognition

Age 48-56
Gender Male, 59.5%
Surprised 76.3%
Happy 19%
Calm 3%
Disgusted 0.6%
Confused 0.4%
Fear 0.3%
Sad 0.2%
Angry 0.2%

AWS Rekognition

Age 47-53
Gender Female, 83.1%
Calm 98.3%
Happy 0.8%
Sad 0.5%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 42-50
Gender Male, 99.2%
Calm 72.3%
Sad 25.9%
Confused 0.6%
Happy 0.4%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Male, 97.8%
Surprised 62.3%
Happy 33.6%
Calm 1%
Disgusted 0.8%
Confused 0.7%
Angry 0.6%
Fear 0.5%
Sad 0.5%

AWS Rekognition

Age 39-47
Gender Male, 95.1%
Calm 85.4%
Surprised 9.8%
Happy 3.7%
Confused 0.4%
Disgusted 0.3%
Angry 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 48-56
Gender Female, 64.8%
Calm 97.5%
Happy 2%
Surprised 0.3%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%
Angry 0%
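Each AWS Rekognition block above corresponds to one detected face, in the shape returned by the `detect_faces` operation with `Attributes=["ALL"]`: an estimated age range, a gender guess with confidence, and a set of emotion scores. A minimal sketch of how one such block could be rendered, assuming the documented FaceDetail response format; the API call is commented out, and `sample_face` reuses the first block's values for illustration:

```python
# Hypothetical sketch: mirrors the public detect_faces FaceDetail shape only.

# import boto3
# response = boto3.client("rekognition").detect_faces(
#     Image={"Bytes": image_bytes}, Attributes=["ALL"])

sample_face = {
    "AgeRange": {"Low": 35, "High": 43},
    "Gender": {"Value": "Male", "Confidence": 98.0},
    "Emotions": [
        {"Type": "SURPRISED", "Confidence": 84.5},
        {"Type": "CALM", "Confidence": 11.1},
        {"Type": "SAD", "Confidence": 2.1},
    ],
}

def summarize_face(face):
    """Summarize one FaceDetail as the lines shown in the blocks above."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:g}%",
    ]
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"],
                          reverse=True):
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:g}%")
    return lines

print("\n".join(summarize_face(sample_face)))
```

Note the emotion confidences within a face sum to roughly 100%, so each block reads as a probability distribution over the eight emotion types.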

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Sunglasses 60.2%

Captions

Microsoft

a group of people sitting posing for the camera 97.3%
a group of people sitting at a table 96.5%
a group of people sitting around a table 96.4%

Text analysis

Amazon

-
8AI
KODAKEEIA

Google

00
00
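The raw strings under "Text analysis" (likely film edge markings misread as text) are consistent with the output of OCR operations such as AWS Rekognition's `detect_text`. A minimal sketch of extracting line-level detections, assuming the documented response shape; the API call is commented out, and `sample_response` reuses the strings reported above:

```python
# Hypothetical sketch: mirrors the public detect_text response shape only.

# import boto3
# response = boto3.client("rekognition").detect_text(Image={"Bytes": image_bytes})

sample_response = {
    "TextDetections": [
        {"DetectedText": "-", "Type": "LINE"},
        {"DetectedText": "8AI", "Type": "LINE"},
        {"DetectedText": "KODAKEEIA", "Type": "LINE"},
    ]
}

def detected_lines(response):
    """Collect LINE-level detections, skipping WORD-level entries
    (the API reports each word both on its own and inside its line)."""
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]

print(detected_lines(sample_response))
```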