Human Generated Data

Title

Faculty Tea

Date

1927

People

Artist: Joseph Woodson Whitesell, American 1876 - 1958

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1710

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.7
Human 99.7
Person 99.2
Person 98.7
Classroom 98.5
School 98.5
Room 98.5
Indoors 98.5
Person 97.8
Person 97.5
Person 96.2
Person 95.8
Chair 94.7
Furniture 94.7
Person 92.9
Person 89.6
Person 87.8
Text 86.5
Person 66
Living Room 60.5

Clarifai
created on 2023-10-26

people 100
group 99.5
adult 98.8
many 98.4
child 97.7
woman 97.4
education 97.2
man 96.3
group together 95.8
elementary school 95.4
furniture 94.3
room 93.9
sitting 93
war 92.8
classroom 92.5
school 92.1
boy 90.7
sit 90.1
teacher 87.9
wear 87.5

Imagga
created on 2022-01-14

brass 45.4
wind instrument 37.6
musical instrument 37.6
room 37.5
classroom 32.7
man 30.9
people 27.3
person 21.8
cornet 20.6
male 20.6
business 20
men 18.9
indoors 17.6
stringed instrument 16.4
interior 15
businessman 14.1
adult 14
couple 13.9
chair 13.9
office 13.8
indoor 13.7
women 13.4
work 13.3
teacher 13.3
computer 12.8
bowed stringed instrument 12.1
home 12
lifestyle 11.6
desk 11.4
black 11.4
table 11.3
meeting 11.3
group 11.3
sitting 11.2
professional 11.1
laptop 11.1
violin 11
device 10.9
job 10.6
love 10.2
inside 10.1
communication 10.1
businesswoman 10
team 9.8
modern 9.8
family 9.8
working 9.7
together 9.6
happiness 9.4
vintage 9.1
old 9
retro 9
suit 9
success 8.8
talking 8.5
smile 8.5
mature 8.4
phone 8.3
executive 8.3
student 8.3
silhouette 8.3
window 8.2
style 8.2
worker 8.1
handsome 8
antique 7.8
glass 7.8
elegant 7.7
casual 7.6
hand 7.6
relaxation 7.5
house 7.5
holding 7.4
occupation 7.3
new 7.3
portrait 7.1

Google
created on 2022-01-14

Suit 77.3
Chair 74.8
Event 72.8
Classic 72.6
Vintage clothing 72.3
Room 65.1
Curtain 64.8
History 64.1
Stock photography 62
Sitting 61.3
Monochrome 60.1
Art 59.5
Visual arts 56
Child 52.5
Retro style 51.5

Microsoft
created on 2022-01-14

text 99.5
indoor 98.1
wall 97.8
clothing 94.6
person 90.2
man 78.1
old 45.9

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 99.3%
Sad 97.2%
Calm 1.1%
Angry 0.6%
Confused 0.4%
Fear 0.3%
Disgusted 0.2%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 16-22
Gender Female, 54.3%
Calm 99.6%
Sad 0.2%
Fear 0.1%
Confused 0%
Surprised 0%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Calm 99.6%
Sad 0.3%
Angry 0%
Confused 0%
Fear 0%
Happy 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 21-29
Gender Female, 99.6%
Calm 97.5%
Sad 1.1%
Confused 0.4%
Surprised 0.4%
Angry 0.3%
Happy 0.3%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 43-51
Gender Female, 100%
Calm 97.1%
Happy 2.1%
Sad 0.5%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Female, 99.9%
Calm 95.2%
Sad 1.7%
Happy 1.2%
Confused 0.6%
Surprised 0.4%
Angry 0.4%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Calm 96.4%
Sad 1.7%
Angry 0.4%
Confused 0.4%
Surprised 0.4%
Happy 0.3%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 42.3%
Sad 22.1%
Happy 10.7%
Confused 6.6%
Angry 6.4%
Disgusted 4.5%
Fear 3.9%
Surprised 3.4%

AWS Rekognition

Age 11-19
Gender Male, 75.6%
Calm 96.4%
Sad 3.3%
Fear 0.1%
Angry 0.1%
Confused 0%
Happy 0%
Surprised 0%
Disgusted 0%

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 20
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Chair 94.7%

Text analysis

Amazon

Faculty
Faculty Tea.1927
Whitesell
Whitesell 0
0
Tea.1927
¿

Google

Tea.
192Y
.
0.
Faculty Tea. 192Y . 0.
Faculty