Human Generated Data

Title

Untitled (three women with guitar, ukelele, and banjo posing in field)

Date

1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1896

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 99.2
Person 98.6
Leisure Activities 98.1
Dress 95.9
Apparel 95.9
Clothing 95.9
Musical Instrument 94.7
Guitar 93.1
Musician 93.0
Viola 80.9
Fiddle 80.9
Violin 80.9
Lute 66.5
Girl 64.9
Female 64.9
Guitarist 61.5
Performer 61.5
Music Band 57.2
Hula 55.6
Toy 55.6

Clarifai
created on 2023-10-25

people 100
group 99.4
adult 97.7
child 97.5
man 96.2
three 96.2
boy 95.1
recreation 95.1
group together 94.8
monochrome 91.1
several 90.6
print 89.1
two 88.6
leader 87.1
woman 83.5
family 82.3
veil 81.3
four 80.8
interaction 80.6
sports equipment 79.9

Imagga
created on 2021-12-14

kin 100
people 25.1
man 23.5
sunset 19.8
mother 19.4
silhouette 17.4
child 16.6
happiness 16.4
person 16.4
love 15.8
outdoor 15.3
happy 15
adult 14.3
male 14.2
outdoors 14.2
joy 13.4
couple 13.1
portrait 12.9
summer 12.9
sky 12.1
sun 12.1
parent 12.1
family 11.6
sibling 11.4
boy 11.3
fun 11.2
lifestyle 10.8
park 10.7
play 10.3
dress 9.9
autumn 9.7
black 9.6
youth 9.4
girls 9.1
life 9.1
sport 9.1
human 9
lady 8.9
women 8.7
day 8.6
face 8.5
relax 8.4
hand 8.3
dark 8.3
fashion 8.3
active 8.1
world 8
together 7.9
bride 7.7
married 7.7
old 7.7
two 7.6
beach 7.6
walking 7.6
field 7.5
free 7.5
relationship 7.5
leisure 7.5
holding 7.4
style 7.4
vacation 7.4
water 7.3
sexy 7.2
romance 7.1
grass 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97.7
clothing 87.6
outdoor 87.5
person 85.1
dress 82.8
standing 75.6
posing 75
drawing 71
woman 67.4
painting 57
sketch 53.1
old 40.6

Color Analysis

Face analysis

AWS Rekognition

Age 28-44
Gender Female, 79.4%
Happy 45.6%
Calm 38.5%
Sad 11.9%
Confused 1.5%
Angry 1.2%
Fear 0.5%
Surprised 0.5%
Disgusted 0.4%

AWS Rekognition

Age 36-52
Gender Male, 70.1%
Calm 55%
Sad 17.6%
Confused 13.6%
Angry 6.9%
Happy 3.2%
Surprised 2.6%
Disgusted 0.6%
Fear 0.5%

AWS Rekognition

Age 41-59
Gender Male, 96.3%
Calm 82.3%
Sad 16.1%
Confused 0.7%
Happy 0.3%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Guitar 93.1%

Categories

Imagga

paintings art 99.1%