Human Generated Data

Title

Untitled (family playing music together)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17058

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.2
Human 98.2
Person 97.3
Person 97.1
Musical Instrument 96.1
Person 95
Person 92.2
Leisure Activities 86.4
Person 82.7
Accordion 76.6
Tie 71.4
Accessories 71.4
Accessory 71.4
Banjo 68.6
Lute 61.7
Musician 57.6
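
A minimal sketch of how the Amazon label list above can be produced, assuming Python, the boto3 SDK with configured AWS credentials, and a hypothetical local file name for the photograph; Rekognition's detect_labels call returns label names with confidence scores like those shown here:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("4.2002.17058.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,
        )

    # One line per label, mirroring the tag list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))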

Clarifai
created on 2023-10-29

people 99.9
group 99.3
adult 99.1
woman 98.9
music 98.4
musician 97.5
man 97
monochrome 96.8
group together 95.9
instrument 95.6
child 94.5
three 93
administration 92.8
elderly 92.4
four 91.7
leader 91.6
many 91.3
facial expression 90.3
several 88.8
wear 85.2

Imagga
created on 2022-02-26

musical instrument 43
room 32.1
people 30.7
man 29.6
accordion 28.5
person 28.3
male 27.7
teacher 26.5
wind instrument 26.4
adult 25.9
salon 25.5
home 24.7
keyboard instrument 22.8
classroom 21.9
businessman 21.2
business 20
men 19.7
indoors 19.3
office 18.6
table 18.2
sitting 18
smiling 16.6
senior 15.9
interior 15.9
mature 14.9
professional 14.8
couple 13.9
educator 13.9
chair 13.5
women 13.4
family 13.3
meeting 13.2
happy 13.2
group 12.9
computer 12.8
indoor 12.8
worker 12.7
portrait 12.3
lifestyle 12.3
together 12.3
businesswoman 11.8
elderly 11.5
smile 11.4
cheerful 11.4
old 11.1
happiness 11
laptop 10.9
desk 10.4
work 10.3
confident 10
team 9.9
modern 9.8
job 9.7
working 9.7
executive 9.5
education 9.5
talking 9.5
businesspeople 9.5
corporate 9.4
two 9.3
clothing 9.3
communication 9.2
face 9.2
house 9.2
kin 9.1
mother 9
to 8.8
looking 8.8
teaching 8.8
retirement 8.6
concertina 8.5
study 8.4
manager 8.4
teamwork 8.3
holding 8.3
board 8.1
device 7.9
living room 7.8
standing 7.8
color 7.8
daughter 7.6
occupation 7.3
alone 7.3
lady 7.3
new 7.3
dress 7.2
free-reed instrument 7.2
handsome 7.1
day 7.1
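
A minimal sketch of requesting Imagga tags like those above, assuming Python with the requests library, hypothetical API credentials, a hypothetical image URL, and the JSON response shape of Imagga's v2 tagging endpoint:

    import requests

    # Hypothetical credentials and image URL.
    API_KEY = "YOUR_IMAGGA_API_KEY"
    API_SECRET = "YOUR_IMAGGA_API_SECRET"
    IMAGE_URL = "https://example.org/4.2002.17058.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )

    # Each result pairs an English tag with a confidence score.
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))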

Microsoft
created on 2022-02-26

person 97.9
text 91.4
indoor 86.6
clothing 71.6
accordion 37.4

Color Analysis

Face analysis

AWS Rekognition

Age 11-19
Gender Female, 75.5%
Calm 82%
Sad 13.7%
Surprised 1.1%
Angry 0.8%
Happy 0.7%
Confused 0.7%
Disgusted 0.5%
Fear 0.5%

AWS Rekognition

Age 28-38
Gender Female, 94.7%
Calm 90.5%
Happy 5.2%
Confused 1.9%
Surprised 1%
Sad 0.4%
Angry 0.4%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Female, 89.1%
Calm 46.1%
Sad 32.4%
Happy 8.3%
Surprised 7.6%
Fear 2.9%
Confused 1.2%
Angry 0.9%
Disgusted 0.7%

AWS Rekognition

Age 36-44
Gender Female, 94.3%
Happy 91.5%
Calm 2.9%
Surprised 1.5%
Confused 1.2%
Fear 1%
Sad 0.9%
Disgusted 0.5%
Angry 0.5%
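
A minimal sketch of how the age ranges, gender estimates, and emotion scores above can be obtained, assuming Python, boto3 with AWS credentials, and a hypothetical image file; Rekognition's detect_faces with Attributes=["ALL"] returns one FaceDetails entry per detected face:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.17058.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 28, "High": 38}
        gender = face["Gender"]     # {"Value": ..., "Confidence": ...}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are scored individually; sort them highest first.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")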

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
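
A minimal sketch of the Google Vision face checks above, assuming Python, the google-cloud-vision client library with configured credentials, and a hypothetical image file; face_detection reports a likelihood rating (Very unlikely through Very likely) per face for surprise, anger, sorrow, joy, headwear, and blur:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.17058.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, VERY_UNLIKELY .. VERY_LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)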

Feature analysis

Amazon

Person
Tie
Person 98.2%
Person 97.3%
Person 97.1%
Person 95%
Person 92.2%
Person 82.7%
Tie 71.4%

Categories

Imagga

interior objects 59.9%
people portraits 36.4%

Text analysis

Amazon

BOOK
NONFAT
DRY
NONFAT DRY MIL
The
INSTANT
MIL
-
ATTLE
in n BOOK
in
n
- Peace بسم -
manwal
بسم
Peace
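
A minimal sketch of how the Amazon text detections above can be produced, assuming Python, boto3, and a hypothetical image file; Rekognition's detect_text returns both whole lines and individual words, which is why fragments such as "NONFAT DRY MIL" and "MIL" both appear:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.17058.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is typed LINE or WORD and carries its own confidence.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))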

Google

The ITTLE L BOOK INSTANT NONFAT DRY MIL
The
ITTLE
L
BOOK
INSTANT
NONFAT
DRY
MIL
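
A minimal sketch of the Google text results above, assuming Python, google-cloud-vision, and a hypothetical image file; text_detection returns the full detected string first, followed by one annotation per word, matching the order of the list above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.17058.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    annotations = response.text_annotations
    if annotations:
        print(annotations[0].description)   # full detected text block
        for word in annotations[1:]:        # then one entry per word
            print(word.description)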