Human Generated Data

Title

Untitled (band playing at Christmas ball with Santa banner in background)

Date

1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9678

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Furniture 98.1
Chair 98.1
Human 96
Person 96
Indoors 93.6
Interior Design 93.6
Person 90.5
Person 81.9
Chair 81.3
Musical Instrument 75.5
Accordion 69.5
Chair 66
Restaurant 61.6
Person 44.4
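
The label/score pairs above follow the output shape of the AWS Rekognition label-detection API (scores are confidence percentages). Below is a minimal sketch of how such tags could be regenerated with boto3; the local file name and the MinConfidence threshold are illustrative assumptions, not values recorded with this object.

    # Sketch: object/scene tags via AWS Rekognition (boto3).
    # "photo.jpg" and MinConfidence=40 are assumptions for illustration.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=40,  # assumed cutoff; the lowest tag listed above is 44.4
        )

    for label in response["Labels"]:
        # Prints pairs like "Furniture 98.1", matching the list above
        print(f"{label['Name']} {label['Confidence']:.1f}")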

Imagga
created on 2022-01-23

sax 43.3
musical instrument 38.5
wind instrument 36.4
bass 34.6
music 31.7
man 28.9
guitar 27.3
person 26
male 25.5
musician 25.3
adult 22.8
instrument 22.3
brass 21.7
concert 21.4
rock 20.8
people 19.5
musical 19.1
play 18.1
black 18
studio 17.5
stage 17
player 17
singer 17
men 16.3
performer 15.8
professional 15.5
stringed instrument 14.8
entertainment 14.7
playing 14.6
accordion 14.5
performance 14.4
businessman 14.1
sound 14
business 14
style 13.3
art 12.4
device 12.3
teacher 12.2
hand 12.1
cornet 11.9
guitarist 11.8
melody 11.7
keyboard instrument 11.6
lifestyle 11.6
group 11.3
song 10.7
band 10.7
job 10.6
oboe 10.5
modern 10.5
horn 10.4
handsome 9.8
worker 9.8
indoors 9.7
star 9.4
silhouette 9.1
holding 9.1
portrait 9.1
bowed stringed instrument 9
jazz 8.8
artist 8.7
event 8.3
guy 8.3
happy 8.1
metal 8
computer 8
solo 7.9
work 7.8
audio 7.6
show 7.6
club 7.5
life 7.5
electric 7.5
equipment 7.3
microphone 7.1
room 7.1
happiness 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 93.4
person 87.4
concert 66.8
people 65.4
group 63.3

Face analysis

Amazon

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Sad 96.6%
Calm 1.5%
Confused 0.5%
Angry 0.5%
Happy 0.4%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 34-42
Gender Female, 76.2%
Calm 98.7%
Sad 0.6%
Happy 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0%
Angry 0%
Fear 0%
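
The age range, gender, and emotion estimates above match the response format of Rekognition's face-detection call. A minimal sketch, assuming only a local copy of the photograph:

    # Sketch: face attributes (age range, gender, emotions) via AWS Rekognition.
    # "photo.jpg" is an assumed local file name, not part of the catalog record.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")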

Feature analysis

Amazon

Chair 98.1%
Person 96%

Captions

Microsoft

a group of people sitting at a table 93.9%
a group of people sitting around a table 93.6%
a group of people sitting in front of a window 86.6%

Text analysis

Amazon

E
KODAK-A-ITW
PMA8
210
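
The short strings above follow the output format of Rekognition's text detection, which reads printed characters visible in the image. A minimal sketch, again assuming a local image file:

    # Sketch: detecting printed text in the image via AWS Rekognition.
    # The input file name is an assumption for illustration.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # keep line-level results, skip WORD duplicates
            print(detection["DetectedText"])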

Google

Y
T37A°2
MJI7-- Y T37A°2 --XAGON
MJI7--
--XAGON