Human Generated Data

Title

Untitled (people at table at party)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20213

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.6
Person 99.5
Human 99.5
Person 98.9
Person 98.7
Person 97.4
Clothing 96.8
Apparel 96.8
Person 95.7
Chair 94.3
Person 94.1
Person 92.3
Table 89.6
Tie 87.3
Accessory 87.3
Accessories 87.3
Restaurant 83.9
Tie 83
Person 80.6
Person 80.2
Glass 78.8
Person 78.1
Overcoat 77
Suit 77
Coat 77
Sitting 73.7
Chair 71.6
Food 67.7
Meal 67.7
Tie 65.2
Person 62.3
Plant 61.6
Tablecloth 59.6
Shirt 59.4
Beverage 58.9
Drink 58.9
Chair 58.7
Clinic 57.8
Person 57.8
Room 57.3
Indoors 57.3
Dining Room 57.3
Pub 57
Bar Counter 57
Dining Table 51.1
Person 45.2
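The Amazon list above has the shape of output from AWS Rekognition's label-detection API (label name plus a confidence score in percent, sorted high to low). A minimal sketch of producing such a listing from a Rekognition-style response; the sample dict here is illustrative, not the actual API output for this photograph:

```python
# Sketch: formatting labels from an AWS Rekognition detect_labels-style
# response into "Name Confidence" lines like the list above.
# With boto3, the real call would be roughly:
#   boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes})

def format_labels(response, min_confidence=45.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = sorted(response.get("Labels", []),
                    key=lambda lbl: lbl["Confidence"], reverse=True)
    return [f"{lbl['Name']} {round(lbl['Confidence'], 1)}"
            for lbl in labels if lbl["Confidence"] >= min_confidence]

sample = {  # hypothetical Rekognition-shaped response, not real output
    "Labels": [
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Furniture", "Confidence": 99.6},
        {"Name": "Table", "Confidence": 89.6},
        {"Name": "Plant", "Confidence": 44.0},  # below threshold, dropped
    ]
}

for line in format_labels(sample):
    print(line)
```

The confidence cutoff explains why the listing stops in the mid-40s: labels below the threshold are simply not reported.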

Imagga
created on 2022-03-05

man 47.8
person 45.1
male 41.9
people 35.7
adult 32.8
business 31
businessman 30.9
office 28.2
patient 27.5
professional 27.1
meeting 26.4
colleagues 25.3
working 24.8
table 24.3
businesswoman 23.7
businesspeople 22.8
team 22.4
sitting 22.4
smiling 21.7
happy 20.1
desk 19.9
men 19.8
room 19.5
talking 19
teacher 18.6
indoors 18.5
couple 18.3
worker 18.1
lab coat 18
work 17.3
doctor 16.9
clinic 16.8
mature 16.8
teamwork 16.7
hospital 16.6
20s 16.5
30s 16.4
corporate 16.3
group 16.1
coat 16.1
computer 16
senior 15.9
medical 15.9
together 15.8
portrait 14.9
indoor 14.6
40s 14.6
laptop 14.6
casual 14.4
smile 14.3
job 14.2
associates 13.8
coworkers 13.8
occupation 13.8
executive 13.7
home 13.6
suit 13.5
lifestyle 13
discussing 12.8
women 12.7
cheerful 12.2
health 11.8
color 11.7
discussion 11.7
life 11.6
holding 11.6
four 11.5
sick person 11.5
case 11.2
waiter 11.2
nurse 11
hairdresser 11
educator 10.5
looking 10.4
day 10.2
face 10
attractive 9.8
student 9.8
busy 9.6
happiness 9.4
camera 9.2
communication 9.2
confident 9.1
care 9.1
restaurant 8.9
business people 8.9
interior 8.9
forties 8.8
clothing 8.8
businessmen 8.8
thirties 8.8
cooperation 8.7
education 8.7
employee 8.6
elderly 8.6
twenties 8.6
ethnic 8.6
adults 8.5
modern 8.4
presentation 8.4
old 8.4
hand 8.4
document 8.2
handsome 8
boardroom 7.9
four people 7.9
collaboration 7.9
30-35 years 7.9
25-30 years 7.8
conference 7.8
mid adult 7.7
using 7.7
illness 7.6
sit 7.6
classroom 7.5
horizontal 7.5
drink 7.5
manager 7.5
treatment 7.4
successful 7.3
new 7.3
board 7.2
specialist 7.2
bright 7.2

Google
created on 2022-03-05

Outerwear 95.4
Shirt 94.9
Coat 90
Table 86
Black-and-white 85.1
Style 83.8
Chair 81.1
Adaptation 79.2
Monochrome photography 75.6
Monochrome 74.9
Suit 74.8
Event 73.6
Room 69
Vintage clothing 68.9
T-shirt 68.3
Art 67.7
Tablecloth 67.3
History 65.5
Motor vehicle 64.3
Hat 63.6

Microsoft
created on 2022-03-05

text 98.5
person 98.3
clothing 91.7
table 91.5
man 85.3
woman 63.1
group 57.5
people 56.3

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 61.1%
Calm 52.3%
Sad 40.6%
Happy 3.5%
Confused 1.7%
Disgusted 0.7%
Surprised 0.6%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Male, 100%
Happy 83.5%
Surprised 6.6%
Confused 3.7%
Fear 1.9%
Sad 1.5%
Disgusted 1.3%
Angry 1.2%
Calm 0.2%

AWS Rekognition

Age 20-28
Gender Male, 76.5%
Happy 25.8%
Surprised 23%
Sad 19.4%
Confused 15.6%
Fear 6.5%
Calm 6.4%
Disgusted 2.1%
Angry 1.2%

AWS Rekognition

Age 53-61
Gender Male, 99.8%
Happy 97.8%
Calm 0.9%
Surprised 0.5%
Confused 0.3%
Disgusted 0.1%
Sad 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Sad 41.6%
Surprised 23%
Fear 16.3%
Happy 8.2%
Calm 3.8%
Confused 3%
Angry 2.3%
Disgusted 1.8%

AWS Rekognition

Age 39-47
Gender Male, 100%
Sad 57.4%
Surprised 17.7%
Confused 8.4%
Happy 6.8%
Calm 4.3%
Disgusted 2.8%
Fear 1.3%
Angry 1.3%
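Each AWS Rekognition readout above (an age range, a gender with confidence, then emotions sorted strongest first) matches the per-face structure returned by Rekognition's face-detection API. A hedged sketch of rendering one such face record into this format; the sample dict is illustrative, not the stored analysis for any face in this image:

```python
# Sketch: rendering one AWS Rekognition detect_faces-style FaceDetail
# as the "Age / Gender / emotions" readout shown above.
# The sample FaceDetail below is hypothetical, not real output.

def format_face(detail):
    lines = [f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}"]
    gender = detail["Gender"]
    lines.append(f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%")
    # Emotions come back unordered; list them strongest first, as above.
    for emo in sorted(detail["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines

sample_face = {  # hypothetical Rekognition-shaped FaceDetail
    "AgeRange": {"Low": 45, "High": 53},
    "Gender": {"Value": "Female", "Confidence": 61.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 40.6},
        {"Type": "CALM", "Confidence": 52.3},
        {"Type": "HAPPY", "Confidence": 3.5},
    ],
}

for line in format_face(sample_face):
    print(line)
```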

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 94.3%
Tie 87.3%
Dining Table 51.1%

Captions

Microsoft

a group of people sitting at a table 96%
a group of people standing around a table 95.9%
a group of people sitting around a table 95.7%

Text analysis

Amazon

28
VAGOY
EVEEIA