Human Generated Data

Title

Untitled (people at table at banquet)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20285

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 100
Chair 99.7
Chair 99.5
Chair 99.1
Table 99.1
Restaurant 99
Person 98.3
Human 98.3
Person 96.9
Person 96.4
Person 96.1
Person 95.5
Person 94.7
Dining Room 94.5
Indoors 94.5
Room 94.5
Meal 94.4
Food 94.4
Person 92.9
Person 88.4
Pub 84.6
Bar Counter 84.6
Apparel 81.2
Clothing 81.2
Person 81
Dish 77.8
Dining Table 75.6
Beverage 67
Drink 67
Cafeteria 61.8
Cafe 61.3
Glass 60
Food Court 57.5
Person 41.6

Imagga
created on 2022-03-05

man 41
male 39
person 37.9
people 37.9
meeting 34.9
businessman 32.7
business 32.2
restaurant 31.3
room 29.2
team 27.8
businesswoman 27.3
colleagues 26.2
group 25.8
office 25.2
businesspeople 24.7
teamwork 23.2
adult 22.7
happy 22.6
table 22.5
men 22.3
building 21.9
desk 21.7
smiling 21
indoors 20.2
patient 20.2
together 20.1
work 20
corporate 19.8
cafeteria 19.7
classroom 19.7
working 19.4
worker 18.8
women 18.2
job 16.8
teacher 16.4
sitting 16.3
talking 16.2
home 15.9
40s 15.6
30s 15.4
professional 15
20s 14.7
presentation 14
couple 13.9
education 13.8
discussing 13.8
discussion 13.6
nurse 13.5
adults 13.2
executive 13.1
structure 13.1
mature 13
lifestyle 13
cheerful 13
hospital 13
successful 12.8
coworkers 12.8
case 12.7
suit 12.6
four 12.5
medical 12.4
color 12.2
indoor 11.9
communication 11.8
to 11.5
career 11.4
senior 11.2
sick person 11
employee 10.8
smile 10.7
interior 10.6
ethnic 10.5
success 10.5
computer 10.4
camera 10.2
student 10.1
board 10.1
laptop 10
hall 9.9
associates 9.8
modern 9.8
conference 9.8
cooperation 9.7
planning 9.6
angle 9.6
college 9.5
staff 9.5
doctor 9.4
friends 9.4
clothes 9.4
chair 9.3
company 9.3
confident 9.1
holding 9.1
portrait 9.1
care 9
clinic 9
health 9
life 9
collaboration 8.9
standing 8.7
mid adult 8.7
diversity 8.6
day 8.6
happiness 8.6
clothing 8.5
plan 8.5
manager 8.4
food 8
family 8
boardroom 7.9
four people 7.9
forties 7.8
diverse 7.8
thirties 7.8
workers 7.8
class 7.7
daytime 7.7
counter 7.7
meal 7.7
twenties 7.6
eating 7.6
educator 7.5
drink 7.5
waiter 7.4
occupation 7.3
friendly 7.3
planner 7.3
new 7.3
grandfather 7.2
father 7.2
child 7.2
dinner 7.1

Google
created on 2022-03-05

Table 95.2
Furniture 94.7
Chair 92.3
Coat 89.9
Human 89.5
Black-and-white 86.5
Style 83.9
Line 82
Suit 79.8
Adaptation 79.2
People 78.3
Beauty 75.2
Monochrome 74.5
Event 73.5
Monochrome photography 72.4
Recreation 71.8
Coffee table 69.9
Room 68.7
Sitting 67.9
Tableware 66.7

Microsoft
created on 2022-03-05

table 99
furniture 95.2
chair 95
person 95
text 92.5
clothing 92.2
black and white 91.7
man 84.6

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 100%
Sad 92.5%
Angry 2.7%
Calm 1.4%
Surprised 0.9%
Disgusted 0.8%
Happy 0.8%
Confused 0.7%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Female, 95.8%
Calm 98.9%
Sad 0.5%
Surprised 0.3%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 93.1%
Calm 45.2%
Surprised 24.4%
Happy 16.3%
Confused 4.6%
Sad 3.9%
Disgusted 2.5%
Fear 1.9%
Angry 1.2%

AWS Rekognition

Age 52-60
Gender Male, 99.8%
Calm 84.3%
Sad 8.4%
Confused 2.9%
Surprised 1.5%
Disgusted 1.1%
Happy 0.7%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 31-41
Gender Female, 99.3%
Calm 52%
Surprised 30.3%
Fear 10.3%
Sad 3.9%
Happy 1.2%
Confused 0.9%
Disgusted 0.8%
Angry 0.6%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 99%
Surprised 0.4%
Confused 0.3%
Sad 0.1%
Disgusted 0.1%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 97.5%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Chair 99.7%
Person 98.3%
Dining Table 75.6%

Captions

Microsoft

a group of people sitting at a table 89%
a group of people sitting around a table 88.3%
a group of people sitting on a table 80.7%

Text analysis

Amazon

P8
RODVK
УТАРАЗ