Human Generated Data

Title

Untitled (family seated at dinner table)

Date

c. 1945

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3402

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Person 99.2
Person 98.8
Person 98.4
Person 97.6
Person 96.5
Sitting 91.7
Crowd 85
Indoors 70.5
Room 66.4
Text 61
Electronics 56.4
Screen 56.4
Press Conference 55.7
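
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. As a minimal sketch only (the file name and threshold values below are hypothetical and not taken from this record), a list in this format could be produced with boto3:

import boto3

# Hypothetical local copy of the photograph; not part of this record.
with open("annas_family_dinner.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # assumed cap on returned labels
    MinConfidence=55.0,  # assumed floor; lowest score shown above is 55.7
)

# Print "label confidence" pairs, matching the layout of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')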

Clarifai
created on 2023-10-26

people 99.9
group 99.3
adult 97.8
woman 96.4
man 96.2
group together 95.9
administration 95.8
child 95.2
education 95.1
leader 93.4
several 92.3
sitting 91.3
many 91.2
sit 90.8
war 90.6
furniture 90.5
family 87.2
three 85.1
school 84
room 83.4

Imagga
created on 2022-01-22

television 56.2
people 27.9
man 24.8
monitor 23.7
broadcasting 23.1
person 22.9
telecommunication system 21
office 19
computer 18.2
room 17.5
telecommunication 17.2
sitting 17.2
adult 17.1
desk 16.9
male 16.3
business 15.8
happy 15.7
home 15.1
indoors 14.9
indoor 14.6
smiling 14.5
working 14.1
electronic equipment 13.8
equipment 13.4
classroom 13.2
education 13
women 12.6
work 12.5
window 12.5
group 12.1
portrait 11.6
medium 11.4
smile 11.4
together 11.4
one 11.2
looking 11.2
interior 10.6
modern 10.5
blackboard 10.3
lifestyle 10.1
businessman 9.7
table 9.7
technology 9.6
senior 9.4
mature 9.3
chair 9.3
communication 9.2
pretty 9.1
laptop 9
team 9
love 8.7
house 8.3
school 8.1
child 8
hair 7.9
couple 7.8
happiness 7.8
model 7.8
corporate 7.7
class 7.7
teacher 7.7
talking 7.6
college 7.6
meeting 7.5
fun 7.5
style 7.4
teamwork 7.4
inside 7.4
student 7.3
back 7.3
book 7.3
lady 7.3
professional 7.3
black 7.2
worker 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.4
person 94
window 85.5
old 80.5
clothing 75.9
man 75.5

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 99.8%
Calm 91.7%
Sad 7.1%
Confused 0.5%
Disgusted 0.2%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.4%
Calm 72.2%
Sad 25.9%
Confused 0.9%
Angry 0.3%
Happy 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 53-61
Gender Female, 58.5%
Calm 99.1%
Sad 0.3%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%
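
The three blocks above follow the shape of AWS Rekognition DetectFaces output: an estimated age range, a gender guess with confidence, and a ranked list of emotion scores per detected face. A minimal sketch of how such a report could be generated (the file name is hypothetical, not part of this record):

import boto3

# Hypothetical local copy of the photograph; not part of this record.
with open("annas_family_dinner.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types come back uppercase, e.g. "CALM"
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')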

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
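
The Google Vision blocks report likelihood categories (Very unlikely through Very likely) rather than numeric scores, which is how the Cloud Vision face detection API expresses its face attributes. A minimal sketch, assuming a hypothetical local file name not taken from this record:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph; not part of this record.
with open("annas_family_dinner.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value such as VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)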

Feature analysis

Amazon

Person 99.3%

Categories