Human Generated Data

Title

Untitled (six groomsmen or ushers standing behind table with wedding cake)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2776

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.3
Human 99.3
Person 98.5
Person 98.1
Person 97.9
Person 97.7
Clothing 97
Overcoat 97
Coat 97
Apparel 97
Person 97
Person 96.4
Tabletop 95
Furniture 95
Person 91.9
Person 90.2
Chair 88
Tuxedo 84.4
Tablecloth 81.2
Table 80.8
People 77.3
Chair 69
Suit 68.1
Dining Table 63.4
Blazer 62.3
Jacket 62.3
Flower 61
Plant 61
Blossom 61
Photography 60.6
Photo 60.6
Crowd 57.5
Suit 56.7
Face 56

Clarifai
created on 2023-10-26

people 99.7
group 99.1
group together 97.4
man 97
leader 95.7
adult 95.6
woman 94.5
furniture 92.5
administration 90.3
monochrome 88.6
musician 87.9
chair 87.5
room 87.4
music 84.4
several 82
many 80.8
actor 79.9
indoors 76.7
ceremony 76.1
three 75.9

Imagga
created on 2022-01-16

man 35.6
male 34.7
people 34.6
person 27.4
kin 27.1
teacher 26.5
business 25.5
office 24.9
adult 24.6
men 24
businessman 23.8
group 22.6
smiling 22.4
professional 22.4
women 21.3
table 20.8
sitting 20.6
happy 20
meeting 19.8
musical instrument 19
room 18.3
indoors 17.6
percussion instrument 17.4
team 17
job 16.8
couple 16.5
cheerful 16.2
communication 15.9
classroom 15.9
work 15.7
educator 14.9
teamwork 14.8
smile 14.2
together 14
corporate 13.7
laptop 13.7
worker 13.7
businesswoman 13.6
marimba 13.2
blackboard 12.6
executive 12.5
holding 12.4
businesspeople 12.3
lifestyle 12.3
computer 12
chair 11.4
togetherness 11.3
education 11.2
two 11
colleagues 10.7
working 10.6
modern 10.5
desk 10.4
technology 10.4
friends 10.3
enjoyment 10.3
happiness 10.2
employee 10.1
color 10
board 9.9
suit 9.9
to 9.7
portrait 9.7
class 9.6
talking 9.5
student 9.5
career 9.5
mature 9.3
confident 9.1
leisure activity 8.8
home 8.8
full length 8.7
boy 8.7
love 8.7
finance 8.4
study 8.4
manager 8.4
school 8.3
indoor 8.2
looking 8
interior 8
black 7.9
casual clothing 7.8
students 7.8
two people 7.8
leader 7.7
drinking 7.6
workplace 7.6
learning 7.5
friendship 7.5
wine 7.4
successful 7.3
sibling 7.2
success 7.2

Google
created on 2022-01-16

Coat 91.3
Black-and-white 85.4
Style 83.9
Rectangle 83.4
Suit 81
Font 80.8
Monochrome 77.7
Monochrome photography 77.4
Table 74.7
Event 72.7
Room 71.8
Design 68.6
Art 67.3
Chair 63.6
Visual arts 62.9
History 58.4
Vintage clothing 58.3
Sitting 54.4
Photographic paper 50.8
Formal wear 50.8

Microsoft
created on 2022-01-16

person 98.9
clothing 93.5
posing 89.7
text 85.9
standing 84.8
black and white 81.7
black 81.3
music 81.2
group 75.3
man 72.2
old 45.3
clothes 16.3
concert band 12.9

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 89.9%
Happy 76.6%
Calm 10.6%
Sad 5.6%
Confused 4.1%
Surprised 1.4%
Fear 0.7%
Disgusted 0.7%
Angry 0.3%

AWS Rekognition

Age 23-33
Gender Male, 97.7%
Happy 51.7%
Sad 25.5%
Confused 9.4%
Disgusted 4.2%
Angry 3.2%
Calm 2.9%
Surprised 1.9%
Fear 1.1%

AWS Rekognition

Age 40-48
Gender Male, 99.6%
Sad 47.7%
Happy 29.4%
Confused 8.4%
Surprised 3.8%
Disgusted 3.6%
Calm 2.9%
Angry 2.8%
Fear 1.4%

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Confused 72%
Happy 11.1%
Sad 6.2%
Calm 5.6%
Surprised 1.6%
Disgusted 1.4%
Angry 1.1%
Fear 1%

AWS Rekognition

Age 48-56
Gender Male, 96.6%
Sad 76.6%
Calm 10.7%
Confused 3.3%
Happy 2.3%
Surprised 2.2%
Angry 1.8%
Disgusted 1.7%
Fear 1.4%

AWS Rekognition

Age 41-49
Gender Male, 93.9%
Sad 41.8%
Happy 16.1%
Calm 13.2%
Disgusted 10.3%
Fear 7.5%
Surprised 5.4%
Angry 4.2%
Confused 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Chair 88%
Suit 68.1%

Text analysis

Amazon

11
KODAK
SAFETY
،
PAIN

Google

"PILM 11
"PILM
11