Human Generated Data

Title

Untitled (large group of men marching)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19363

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Shorts 99.8
Clothing 99.8
Apparel 99.8
Person 99.7
Human 99.7
Person 99.6
Person 98.5
Person 97.4
Person 97.3
Person 96
Person 93
Person 92.8
Performer 91.6
Person 82.2
Flooring 71.7
Female 69.6
Leisure Activities 65.1
Floor 63.2
People 62.3
Photography 60.8
Photo 60.8
Dance 60.7
Face 59.9
Sleeve 59.5
Dance Pose 58.4
Dress 56.1
Suit 55.5
Coat 55.5
Overcoat 55.5
Person 45.5

Clarifai
created on 2023-10-22

people 99.9
group together 98.2
wear 97.8
group 97.4
adult 97.3
man 96.6
many 95.9
woman 94.9
military 92.7
leader 90.9
outfit 89.7
uniform 89.1
dancing 87.9
music 81.6
recreation 78.4
war 77.8
musician 77.5
several 77
dancer 75.6
administration 74.8

Imagga
created on 2022-03-05

people 36.3
dancer 30.7
person 30.7
group 25.8
men 25.8
man 24.2
silhouette 23.2
women 22.1
adult 21.6
performer 21.5
male 21.3
athlete 21.3
runner 21.2
human 19.5
active 18.1
team 17.9
business 17
fashion 16.6
walking 16.1
success 16.1
crowd 15.4
businessman 15
entertainer 14.8
action 14.8
city 14.1
sport 13.7
dress 13.6
run 13.5
urban 13.1
couple 13.1
world 13
street 12.9
body 12.8
happy 12.5
together 12.3
black 11.7
dance 11.5
legs 11.3
boy 11.3
corporate 11.2
motion 11.1
competition 11
lifestyle 10.8
posing 10.7
running 10.6
life 10.5
portrait 10.4
work 10.2
happiness 10.2
girls 10
picket fence 10
fitness 9.9
clothing 9.8
fence 9.8
youth 9.4
teamwork 9.3
sidewalk 9.1
pretty 9.1
businesswoman 9.1
exercise 9.1
fun 9
outdoors 9
activity 9
style 8.9
wall 8.7
silhouettes 8.7
contestant 8.7
standing 8.7
move 8.6
child 8.6
casual 8.5
design 8.4
attractive 8.4
speed 8.2
lady 8.1
shadow 8.1
looking 8
hands 7.8
model 7.8
travel 7.7
track 7.7
career 7.6
friendship 7.5
leisure 7.5
occupation 7.3
sun 7.2
sexy 7.2
family 7.1
day 7.1
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.5
person 90.9
clothing 88
man 80.8
posing 71
footwear 64.2
white 63
black and white 60.2
group 56.7
clothes 24

Face analysis

Amazon

Google

AWS Rekognition

Age 48-54
Gender Male, 98.3%
Calm 99%
Surprised 0.6%
Confused 0.1%
Angry 0.1%
Sad 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 89.5%
Calm 94.1%
Disgusted 2.6%
Happy 0.7%
Surprised 0.7%
Confused 0.6%
Sad 0.6%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 85.2%
Sad 69.1%
Calm 21.7%
Disgusted 3.5%
Confused 2.3%
Happy 1.6%
Angry 1%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 23-33
Gender Male, 92.1%
Sad 46.7%
Calm 21.7%
Surprised 12.3%
Fear 8.3%
Confused 6.3%
Disgusted 2.3%
Angry 1.7%
Happy 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 99.6%
Person 98.5%
Person 97.4%
Person 97.3%
Person 96%
Person 93%
Person 92.8%
Person 82.2%
Person 45.5%

Text analysis

Amazon

Y3RA-A