Human Generated Data

Title

Untitled (people at banquet)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20278

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.5
Human 98.5
Person 97.5
Person 97.2
Person 94.4
Interior Design 93.6
Indoors 93.6
Tie 92.3
Accessories 92.3
Accessory 92.3
Person 91.2
Face 86.9
Person 86.1
Person 81.9
Person 80.1
Clothing 74.2
Apparel 74.2
Sitting 73.9
Person 72.7
Room 70
Person 69.4
Female 68.7
Person 68.1
People 67.8
Crowd 65.8
Overcoat 60.2
Suit 60.2
Coat 60.2
Girl 59.5
Table 58.9
Furniture 58.9
Bar Counter 57.4
Pub 57.4
Electronics 56.6
Screen 56.6
Monitor 56.3
Display 56.3
Floor 55.2
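
Each line in the tag lists above pairs a label with a confidence score (labels may contain spaces, e.g. "Interior Design 93.6"). As a minimal sketch, assuming the plain "Label score" layout shown here, the lines can be parsed by splitting on the last whitespace:

```python
def parse_tags(text):
    """Parse 'Label score' lines (e.g. 'Person 98.5') into (label, score) pairs.

    Labels may contain spaces ('Interior Design 93.6'), so split on the
    last whitespace and treat the trailing token as the confidence score.
    """
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# A few lines copied from the Amazon tag list above.
sample = """Person 98.5
Interior Design 93.6
Bar Counter 57.4"""

print(parse_tags(sample))
# → [('Person', 98.5), ('Interior Design', 93.6), ('Bar Counter', 57.4)]
```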

Imagga
created on 2022-03-05

man 39.6
percussion instrument 36.1
musical instrument 35.3
marimba 32.2
people 27.9
male 26.9
adult 25.1
sitting 24.9
business 23.7
classroom 22.2
office 21.5
businessman 21.2
table 21.2
smiling 21
person 20.5
happy 18.8
indoors 17.6
lifestyle 17.3
work 17.3
meeting 16.9
room 16.9
teacher 16.8
working 16.8
teamwork 16.7
together 16.6
professional 16.6
worker 16.5
group 16.1
job 15.9
women 15.8
corporate 15.5
men 15.4
smile 15
businesswoman 14.5
team 14.3
communication 14.3
desk 14.2
indoor 13.7
salon 13.4
chair 13.1
education 13
executive 12.9
laptop 12.9
computer 12.8
businesspeople 12.3
couple 12.2
two 11.8
happiness 11.7
looking 11.2
color 11.1
cheerful 10.6
talking 10.4
adults 10.4
love 10.2
mature 10.2
suit 9.9
modern 9.8
handsome 9.8
discussion 9.7
interior 9.7
colleagues 9.7
workplace 9.5
coffee 9.3
confident 9.1
student 9.1
technology 8.9
hall 8.8
mid adult 8.7
paper 8.6
career 8.5
togetherness 8.5
black 8.4
success 8
restaurant 7.9
20 24 years 7.9
conference 7.8
40s 7.8
two people 7.8
portrait 7.8
class 7.7
hand 7.6
employee 7.6
sit 7.6
side 7.5
enjoyment 7.5
building 7.5
silhouette 7.4
manager 7.4
holding 7.4
phone 7.4
stringed instrument 7.3
successful 7.3
school 7.3
device 7.2
board 7.2
to 7.1

Google
created on 2022-03-05

Black 89.5
Line 82.3
Font 79.2
Art 75.8
Rectangle 75.8
Tints and shades 75.5
Table 71.3
Chair 70.8
Monochrome photography 68.2
Suit 66.8
Event 66.2
Sitting 63.4
History 63.1
Stock photography 63
Monochrome 62.7
Room 62.4
Photo caption 59.6
Recreation 58.7
Visual arts 57.7
Team 57.7

Microsoft
created on 2022-03-05

text 99.7
person 96.4
ship 86.9
black and white 76.4
man 70.6
clothing 55.9

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 77.8%
Happy 85.8%
Calm 6.9%
Surprised 5.6%
Fear 0.7%
Sad 0.3%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 30-40
Gender Male, 97.9%
Happy 62.2%
Calm 15%
Surprised 7.5%
Confused 5.1%
Sad 4.3%
Fear 2.8%
Disgusted 2.3%
Angry 0.9%

AWS Rekognition

Age 51-59
Gender Male, 99.7%
Sad 49%
Calm 33.3%
Confused 10.7%
Happy 3.8%
Surprised 1.3%
Angry 0.8%
Disgusted 0.7%
Fear 0.3%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Happy 81.4%
Sad 9.8%
Calm 2.7%
Confused 2.6%
Surprised 1%
Fear 0.9%
Disgusted 0.9%
Angry 0.7%

AWS Rekognition

Age 33-41
Gender Female, 90.8%
Calm 69.5%
Happy 13.3%
Sad 11%
Surprised 2%
Fear 1.7%
Confused 1.3%
Disgusted 0.7%
Angry 0.5%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 87%
Surprised 3.9%
Fear 3.4%
Confused 2.3%
Disgusted 1.1%
Happy 1.1%
Sad 0.7%
Angry 0.5%

AWS Rekognition

Age 51-59
Gender Female, 58.5%
Sad 61.3%
Happy 21.6%
Calm 10.9%
Fear 2.4%
Confused 1.5%
Disgusted 1.1%
Angry 0.6%
Surprised 0.6%

AWS Rekognition

Age 24-34
Gender Male, 97%
Calm 53%
Sad 42.8%
Angry 1.1%
Fear 0.9%
Confused 0.9%
Surprised 0.7%
Disgusted 0.5%
Happy 0.2%

AWS Rekognition

Age 26-36
Gender Male, 97.8%
Calm 96.4%
Sad 1.9%
Surprised 0.4%
Fear 0.4%
Confused 0.3%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%
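
Each AWS Rekognition face record above reports a confidence for each of eight emotions, and the eight values total roughly 100%. A quick check using the values copied from the first record in this section:

```python
# Emotion confidences from the first AWS Rekognition face record above
# (Age 37-45, Gender Female). The eight per-emotion confidences should
# sum to approximately 100%.
emotions = {
    "Happy": 85.8, "Calm": 6.9, "Surprised": 5.6, "Fear": 0.7,
    "Sad": 0.3, "Disgusted": 0.3, "Angry": 0.2, "Confused": 0.2,
}

total = sum(emotions.values())
print(round(total, 1))
# → 100.0
```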

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Tie 92.3%

Captions

Microsoft

a group of people sitting on a bench 54.4%
a group of people sitting on a bench in front of a crowd 50%
a group of people in a room 49.9%

Text analysis

Amazon

28

Google

YT
7A2
28 YTヨ7A2
28