Human Generated Data

Title

Untitled (musicians in ballroom)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19242

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 98.9
Human 98.9
Person 98.5
Poster 97.6
Advertisement 97.6
Person 96.4
Chair 92.3
Furniture 92.3
Machine 89.4
Crowd 85
Chair 70
Pump 68.7
Nature 66.2
Outdoors 65.3
Chair 64.6
Tripod 60.8
Gas Station 60.8
Gas Pump 56
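
The Amazon labels above are the kind of output returned by AWS Rekognition's label-detection endpoint. Below is a minimal sketch, under stated assumptions, of how such label/confidence pairs could be retrieved with boto3; the image file name and the MaxLabels/MinConfidence thresholds are illustrative and not part of the catalog record.

```python
# Hypothetical sketch: fetch label/confidence pairs like those listed above
# with AWS Rekognition. The image path and thresholds are assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("burian_ballroom.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

# Print each detected label with its confidence score, e.g. "Person 98.9"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```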

Clarifai
created on 2023-10-22

people 99.8
adult 96.3
man 96.3
woman 95.7
group 93.6
chair 93
group together 92
wear 90.2
school 90.1
room 89.5
education 89.3
furniture 86.2
child 85.6
many 84.5
indoors 80.7
wait 79
seat 78.3
boy 78.3
recreation 77.6
audience 77

Imagga
created on 2022-02-25

sport 43
volleyball net 42.2
net 36.2
stick 31.8
people 26.8
game equipment 26.7
equipment 26.3
man 26.2
crutch 22.4
silhouette 21.5
male 21.3
ski 17.6
person 17.2
walking 17
staff 16.6
hockey stick 16.3
men 16.3
snow 16.1
winter 15.3
outdoors 14.9
active 13.8
adult 13.6
sports equipment 12.9
outdoor 12.2
travel 12
activity 11.6
walk 11.4
business 10.9
exercise 10.9
lifestyle 10.8
vacation 10.6
landscape 10.4
city 10
sunset 9.9
mountain 9.8
couple 9.6
building 9.5
women 9.5
cold 9.5
day 9.4
water 9.3
tripod 9.3
beach 9.3
leisure 9.1
fun 9
group 8.9
brass 8.5
summer 8.4
sky 8.3
speed 8.2
wind instrument 8.1
recreation 8.1
trombone 8
urban 7.9
black 7.8
attractive 7.7
crowd 7.7
old 7.7
skier 7.5
one 7.5
athlete 7.4
action 7.4
window 7.3
life 7.3
alone 7.3
sun 7.2
office 7.2
fitness 7.2
transportation 7.2
holiday 7.2
sea 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.8
person 88.1
clothing 79.7
furniture 70.5
people 66.6
black 66.3
man 66.1
chair 62.4

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 100%
Sad 99.8%
Confused 0.1%
Angry 0.1%
Calm 0%
Fear 0%
Disgusted 0%
Happy 0%
Surprised 0%
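
Age range, gender, and emotion percentages like those above correspond to AWS Rekognition's face-detection output. A minimal sketch, assuming the same hypothetical image file as in the earlier snippet:

```python
# Hypothetical sketch: age range, gender, and emotion confidences
# from AWS Rekognition face detection.
import boto3

rekognition = boto3.client("rekognition")

with open("burian_ballroom.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```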

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
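
The Google Vision blocks above report per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch of reading those likelihoods with the google-cloud-vision client, again assuming a hypothetical image file:

```python
# Hypothetical sketch: per-face likelihood buckets from Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("burian_ballroom.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries likelihood enums such as VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```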

Feature analysis

Amazon

Person 98.9%
Person 98.5%
Person 96.4%
Poster 97.6%
Chair 92.3%
Chair 70%
Chair 64.6%

Categories

Text analysis

Amazon

133
trat
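
The text analysis results are OCR output; strings like the Amazon entries above could be produced by Rekognition's text-detection endpoint, sketched below under the same assumptions as the earlier snippets.

```python
# Hypothetical sketch: OCR strings from AWS Rekognition text detection.
import boto3

rekognition = boto3.client("rekognition")

with open("burian_ballroom.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Print detected lines of text; word-level detections are skipped.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```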

Google

144 /33
144
/33